Apr 20 21:12:30.150318 ip-10-0-129-57 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 21:12:30.589376 ip-10-0-129-57 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:12:30.589376 ip-10-0-129-57 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 21:12:30.589376 ip-10-0-129-57 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:12:30.589376 ip-10-0-129-57 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 21:12:30.589376 ip-10-0-129-57 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
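Editor's note: the deprecation warnings above all point at the same remedy — move the flag values into the file passed via --config. A minimal sketch of what that could look like, assuming the upstream kubelet.config.k8s.io/v1beta1 KubeletConfiguration API; the concrete values here are placeholders, not taken from this node:

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment (values are illustrative).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:              # per the --minimum-container-ttl-duration warning, use eviction settings instead
  memory.available: 100Mi
```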
Apr 20 21:12:30.591190 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.591041 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 21:12:30.594032 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594017 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:12:30.594032 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594032 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594036 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594039 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594042 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594050 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594054 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594057 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594060 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594062 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594067 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594071 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594074 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594077 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594080 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594083 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594086 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594089 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594091 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594094 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:12:30.594093 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594098 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594100 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594103 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594106 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594109 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594111 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594114 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594116 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594119 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594122 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594125 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594127 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594130 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594132 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594135 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594137 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594140 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594142 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594184 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594223 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:12:30.594746 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594226 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:12:30.595360 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594228 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:12:30.595360 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594231 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:30.595360 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594234 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:12:30.595360 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594236 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:12:30.595360 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594239 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:30.595802 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.594244 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:12:30.596020 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596006 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596022 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596027 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596032 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596038 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596043 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596048 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596052 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596057 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:30.596058 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596061 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596066 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596070 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596075 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596079 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596082 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596086 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596090 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596094 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596098 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596102 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596106 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596109 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596114 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596118 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596122 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596126 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596134 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596139 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:30.596274 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596143 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596147 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596151 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596155 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596159 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596163 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596167 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596170 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596173 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596176 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596178 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596559 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596565 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596568 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596571 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596573 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596576 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596579 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596582 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:12:30.596720 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596586 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
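Editor's note: the "unrecognized feature gate" warnings repeat (the kubelet applies the gate map more than once at startup), so the same gate names appear in several passes above. A small illustrative helper, not part of this log, for reducing such output to the distinct gate names; the sample text and regex are assumptions for demonstration:

```python
import re

# A few lines in the shape of the warnings above (abbreviated sample).
sample = """\
W0420 21:12:30.594017 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
W0420 21:12:30.596670 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
W0420 21:12:30.594042 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
"""

def unique_gates(log_text: str) -> list[str]:
    """Return the sorted set of distinct gate names flagged as unrecognized."""
    return sorted(set(re.findall(r"unrecognized feature gate: (\S+)", log_text)))

print(unique_gates(sample))  # ['DNSNameResolver', 'SigstoreImageVerification']
```

On a live node the equivalent one-liner would pipe `journalctl -u kubelet` through the same pattern.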
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596590 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596593 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596596 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596599 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596601 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596604 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596606 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596609 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596611 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596614 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596616 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596619 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596622 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596624 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596627 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596631 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596635 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596638 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:12:30.597197 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596641 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596643 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596646 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596649 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596651 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596654 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596656 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596658 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596661 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596663 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596665 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596668 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596670 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596673 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596675 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596677 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596680 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596682 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596684 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596687 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:30.597650 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596689 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596692 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596694 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596697 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596699 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596703 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596706 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596708 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596711 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596714 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596717 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596719 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596722 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596725 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596728 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596731 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596733 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596735 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596738 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596740 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:12:30.598146 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596743 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596745 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596747 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596750 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596752 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596755 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596757 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596759 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596762 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596764 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596767 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596769 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596771 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596775 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596777 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596780 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596782 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596785 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.596788 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596866 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596877 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 21:12:30.598641 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596887 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596893 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596899 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596905 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596910 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596914 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596917 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596921 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596925 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596928 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596931 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596933 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596936 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596939 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596942 2567 flags.go:64] FLAG: --cloud-config=""
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596945 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596948 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596952 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596955 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596958 2567 flags.go:64] FLAG: --config-dir=""
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596960 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596964 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596968 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596971 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 21:12:30.599183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596974 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596977 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596980 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.596999 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597005 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597008 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597011 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597016 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597019 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597022 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597025 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597028 2567 flags.go:64] FLAG: --enable-server="true"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597031 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597035 2567 flags.go:64] FLAG: --event-burst="100"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597038 2567 flags.go:64] FLAG: --event-qps="50"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597041 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597044 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597047 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597051 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597054 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597057 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597060 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597062 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597065 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597068 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 21:12:30.599793 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597071 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597074 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597077 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597079 2567 flags.go:64] FLAG: --feature-gates=""
Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597083 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]:
I0420 21:12:30.597086 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597089 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597092 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597095 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597098 2567 flags.go:64] FLAG: --help="false" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597100 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597104 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597107 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597110 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597113 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597117 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597119 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597122 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597125 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 
21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597128 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597130 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597133 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597136 2567 flags.go:64] FLAG: --kube-reserved="" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597139 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 21:12:30.600402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597142 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597145 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597148 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597151 2567 flags.go:64] FLAG: --lock-file="" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597153 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597156 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597159 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597164 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597167 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597169 2567 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597172 2567 flags.go:64] FLAG: --logging-format="text" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597175 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597178 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597181 2567 flags.go:64] FLAG: --manifest-url="" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597183 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597188 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597190 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597194 2567 flags.go:64] FLAG: --max-pods="110" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597197 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597205 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597208 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597211 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597215 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597217 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 21:12:30.600965 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:12:30.597221 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 21:12:30.600965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597228 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597231 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597234 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597237 2567 flags.go:64] FLAG: --pod-cidr="" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597240 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597245 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597247 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597250 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597254 2567 flags.go:64] FLAG: --port="10250" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597257 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597260 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f50743fec0697a83" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597263 2567 flags.go:64] FLAG: --qos-reserved="" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597266 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 20 21:12:30.601622 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597268 2567 flags.go:64] FLAG: --register-node="true" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597271 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597274 2567 flags.go:64] FLAG: --register-with-taints="" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597277 2567 flags.go:64] FLAG: --registry-burst="10" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597280 2567 flags.go:64] FLAG: --registry-qps="5" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597283 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597286 2567 flags.go:64] FLAG: --reserved-memory="" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597289 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597292 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597295 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597298 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597300 2567 flags.go:64] FLAG: --runonce="false" Apr 20 21:12:30.601622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597303 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597306 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597310 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 20 21:12:30.602237 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597313 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597316 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597319 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597322 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597325 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597328 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597330 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597333 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597336 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597339 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597341 2567 flags.go:64] FLAG: --system-cgroups="" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597344 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597349 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597352 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597356 2567 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597360 2567 flags.go:64] FLAG: --tls-min-version="" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597363 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597366 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597369 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597371 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597374 2567 flags.go:64] FLAG: --v="2" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597378 2567 flags.go:64] FLAG: --version="false" Apr 20 21:12:30.602237 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597382 2567 flags.go:64] FLAG: --vmodule="" Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597394 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.597397 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597482 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597486 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597489 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597492 2567 feature_gate.go:328] unrecognized feature 
gate: EtcdBackendQuota Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597495 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597498 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597503 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597506 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597509 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597512 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597514 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597517 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597519 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597522 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597524 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597527 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597529 2567 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 21:12:30.602862 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597532 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597534 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597537 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597539 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597542 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597545 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597548 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597553 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597556 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597559 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597562 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597565 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597568 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597570 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597573 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597576 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597578 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597580 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597583 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 21:12:30.603465 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597586 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597588 2567 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597592 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597597 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597599 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597602 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597605 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597608 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597610 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597613 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597615 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597618 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597620 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597622 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 21:12:30.603962 ip-10-0-129-57 
kubenswrapper[2567]: W0420 21:12:30.597625 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597627 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597630 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597632 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597635 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597638 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 21:12:30.603962 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597640 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597643 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597645 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597648 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597650 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597653 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597656 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 21:12:30.604466 
ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597658 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597661 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597663 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597666 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597668 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597671 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597673 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597677 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597683 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597712 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597715 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597718 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597720 2567 feature_gate.go:328] unrecognized feature 
gate: GCPCustomAPIEndpointsInstall
Apr 20 21:12:30.604466 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597723 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597726 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597728 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597731 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597734 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597736 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597739 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597741 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597744 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.597746 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:30.604959 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.598615 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:12:30.606051 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.606032 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 21:12:30.606086 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.606052 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606099 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606104 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606107 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606109 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606112 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606114 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606117 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:12:30.606116 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606121 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606123 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606126 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606129 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606131 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606133 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606136 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606138 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606141 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606145 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606149 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606152 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606155 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606157 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606160 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606162 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606165 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606167 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606170 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606172 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:30.606317 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606175 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606178 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606180 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606183 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606186 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606188 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606191 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606194 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606196 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606198 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606201 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606203 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606205 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606208 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606211 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606213 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606216 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606218 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606221 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:12:30.606800 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606223 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606226 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606228 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606232 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606235 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606238 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606241 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606243 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606245 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606248 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606251 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606253 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606255 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606258 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606260 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606263 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606265 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606268 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606274 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:12:30.607387 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606277 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606280 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606282 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606285 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606287 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606290 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606292 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606294 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606297 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606300 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606302 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606305 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606308 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606310 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606313 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606315 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606318 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606320 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606323 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606325 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:12:30.607837 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606328 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.606333 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606427 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606431 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606434 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606436 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606439 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606441 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606444 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606446 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606449 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606451 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606454 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606457 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606459 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606462 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:12:30.608337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606464 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606467 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606469 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606472 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606474 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606477 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606480 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606482 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606485 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606487 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606490 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606492 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606495 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606497 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606499 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606502 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606504 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606507 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606509 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606512 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:12:30.608719 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606514 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606517 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606519 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606522 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606524 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606527 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606529 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606531 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606534 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606537 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606539 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606542 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606544 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606546 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606549 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606551 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606553 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606556 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606558 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:12:30.609210 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606561 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606564 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606566 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606569 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606571 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606573 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606577 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606580 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606583 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606586 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606588 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606590 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606593 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606595 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606598 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606600 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606602 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606605 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606609 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:12:30.609679 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606612 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606615 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606618 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606620 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606624 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606626 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606629 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606632 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606634 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606636 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606639 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606642 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606644 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:30.606646 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.606651 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:12:30.610151 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.607288 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 21:12:30.610509 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.609176 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 21:12:30.610509 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.610370 2567 server.go:1019] "Starting client certificate rotation"
Apr 20 21:12:30.610509 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.610461 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:12:30.610509 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.610493 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:12:30.635484 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.635466 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 21:12:30.637834 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.637816 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 21:12:30.652811 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.652788 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 20 21:12:30.660254 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.660238 2567 log.go:25] "Validated CRI v1 image API"
Apr 20 21:12:30.661406 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.661386 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 21:12:30.664975 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.664956 2567 fs.go:135] Filesystem UUIDs: map[5d49f78d-a572-452f-8e4d-b3bf5e7f18c1:/dev/nvme0n1p3 5dd6fd2f-f06a-4123-a585-be7cabec3308:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 20 21:12:30.665064 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.664974 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 21:12:30.667592 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.667569 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 21:12:30.670789 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.670684 2567 manager.go:217] Machine: {Timestamp:2026-04-20 21:12:30.668704911 +0000 UTC m=+0.397673500 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098692 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec280d18208e598866218fded0464f13 SystemUUID:ec280d18-208e-5988-6621-8fded0464f13 BootID:1dd1c95a-b3f5-48b6-b76b-35ea3bbaedd4 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:58:90:ab:1a:57 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:58:90:ab:1a:57 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:58:9d:44:1a:29 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 21:12:30.670789 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.670785 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 21:12:30.670895 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.670880 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 21:12:30.671808 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.671784 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 21:12:30.671972 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.671811 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-57.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 21:12:30.672029 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.671981 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 21:12:30.672029 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.672006 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 21:12:30.672029 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.672019 2567
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 21:12:30.672669 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.672659 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 21:12:30.673780 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.673770 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 20 21:12:30.673887 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.673877 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 21:12:30.676609 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.676599 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 20 21:12:30.676643 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.676620 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 21:12:30.676643 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.676632 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 21:12:30.676643 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.676641 2567 kubelet.go:397] "Adding apiserver pod source" Apr 20 21:12:30.676721 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.676660 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 21:12:30.677558 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.677546 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 21:12:30.677594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.677566 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 21:12:30.681458 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.681440 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 21:12:30.683249 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:12:30.683232 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 21:12:30.684347 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684335 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684352 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684359 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684364 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684371 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684376 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684382 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684387 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684393 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 21:12:30.684400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684399 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 21:12:30.684641 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684408 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 21:12:30.684641 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.684416 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 21:12:30.685230 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.685220 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 21:12:30.685230 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.685230 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 21:12:30.689039 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.689026 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 21:12:30.689090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.689064 2567 server.go:1295] "Started kubelet" Apr 20 21:12:30.689224 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.689188 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 21:12:30.689284 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.689188 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 21:12:30.689284 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.689270 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 21:12:30.689741 ip-10-0-129-57 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 21:12:30.690301 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.690060 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-57.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 21:12:30.690301 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.690069 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 21:12:30.690301 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.690173 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-57.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 21:12:30.690830 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.690806 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 21:12:30.691197 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.691181 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 20 21:12:30.695292 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.695183 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 21:12:30.695459 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.695437 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 21:12:30.696535 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.696513 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 21:12:30.696535 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:12:30.696537 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 21:12:30.696704 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.696688 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 21:12:30.696772 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.696763 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 20 21:12:30.696772 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.696771 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 20 21:12:30.698066 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.698038 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:30.698301 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.698280 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 21:12:30.698301 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.698302 2567 factory.go:55] Registering systemd factory Apr 20 21:12:30.698447 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.698312 2567 factory.go:223] Registration of the systemd container factory successfully Apr 20 21:12:30.698549 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.698528 2567 factory.go:153] Registering CRI-O factory Apr 20 21:12:30.698608 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.698557 2567 factory.go:223] Registration of the crio container factory successfully Apr 20 21:12:30.698661 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.698638 2567 factory.go:103] Registering Raw factory Apr 20 21:12:30.698661 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.698657 2567 manager.go:1196] Started watching for new ooms in manager Apr 20 21:12:30.699231 ip-10-0-129-57 kubenswrapper[2567]: E0420 
21:12:30.699203 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 21:12:30.699347 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.699331 2567 manager.go:319] Starting recovery of all containers Apr 20 21:12:30.699480 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.699460 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 21:12:30.699551 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.699518 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-57.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 21:12:30.702254 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.699275 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-57.ec2.internal.18a82d04944fe870 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-57.ec2.internal,UID:ip-10-0-129-57.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-57.ec2.internal,},FirstTimestamp:2026-04-20 21:12:30.689044592 +0000 UTC m=+0.418013186,LastTimestamp:2026-04-20 21:12:30.689044592 +0000 UTC m=+0.418013186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-57.ec2.internal,}" Apr 20 21:12:30.712200 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.712185 2567 manager.go:324] Recovery completed Apr 20 21:12:30.714023 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.714007 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qxrpv" Apr 20 21:12:30.715860 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.715848 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:12:30.718004 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.717972 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:12:30.718076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.718015 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:12:30.718076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.718026 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:12:30.718468 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.718455 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 21:12:30.718468 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.718467 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 21:12:30.718541 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.718481 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 20 21:12:30.719540 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.719526 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qxrpv" Apr 20 21:12:30.720147 ip-10-0-129-57 kubenswrapper[2567]: E0420 
21:12:30.720074 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-57.ec2.internal.18a82d049609c6b1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-57.ec2.internal,UID:ip-10-0-129-57.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-57.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-57.ec2.internal,},FirstTimestamp:2026-04-20 21:12:30.718002865 +0000 UTC m=+0.446971455,LastTimestamp:2026-04-20 21:12:30.718002865 +0000 UTC m=+0.446971455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-57.ec2.internal,}" Apr 20 21:12:30.720595 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.720580 2567 policy_none.go:49] "None policy: Start" Apr 20 21:12:30.720595 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.720601 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 21:12:30.720595 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.720615 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 20 21:12:30.754652 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.754636 2567 manager.go:341] "Starting Device Plugin manager" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.754676 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.754689 2567 server.go:85] "Starting device plugin registration server" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.754938 2567 
eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.755133 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.755233 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.755309 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.755318 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.755784 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 21:12:30.769089 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.755822 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:30.801189 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.801163 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 21:12:30.802458 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.802439 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 21:12:30.802522 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.802461 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 21:12:30.802522 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.802479 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 21:12:30.802522 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.802485 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 21:12:30.802522 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.802513 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 21:12:30.806018 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.806002 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:12:30.855608 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.855565 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:12:30.856471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.856458 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:12:30.856526 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.856484 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:12:30.856526 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.856493 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:12:30.856526 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.856513 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.864926 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.864907 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.864926 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.864927 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-57.ec2.internal\": node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:30.881030 
ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.881008 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:30.903159 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.903114 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal"] Apr 20 21:12:30.903239 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.903208 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:12:30.904661 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.904646 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:12:30.904756 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.904677 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:12:30.904756 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.904688 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:12:30.905836 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.905823 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:12:30.905974 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.905958 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.906040 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.906003 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:12:30.906847 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.906823 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:12:30.906847 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.906843 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:12:30.906951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.906868 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:12:30.906951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.906879 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:12:30.906951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.906848 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:12:30.906951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.906934 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:12:30.907981 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.907964 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.908064 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.908006 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:12:30.909133 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.909110 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:12:30.909224 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.909145 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:12:30.909224 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.909158 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:12:30.923514 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.923495 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-57.ec2.internal\" not found" node="ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.927826 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.927812 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-57.ec2.internal\" not found" node="ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.981756 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:30.981741 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:30.998348 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.998324 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/45dd152d1fbe0ae39e2b4acf83e7aee8-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal\" (UID: \"45dd152d1fbe0ae39e2b4acf83e7aee8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.998441 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.998356 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45dd152d1fbe0ae39e2b4acf83e7aee8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal\" (UID: \"45dd152d1fbe0ae39e2b4acf83e7aee8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:30.998441 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:30.998408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/011bd93cb6528efda482582d85ad698c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-57.ec2.internal\" (UID: \"011bd93cb6528efda482582d85ad698c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.082626 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:31.082592 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:31.098951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.098933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/011bd93cb6528efda482582d85ad698c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-57.ec2.internal\" (UID: \"011bd93cb6528efda482582d85ad698c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.099026 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.098959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/45dd152d1fbe0ae39e2b4acf83e7aee8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal\" (UID: \"45dd152d1fbe0ae39e2b4acf83e7aee8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.099026 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.098976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45dd152d1fbe0ae39e2b4acf83e7aee8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal\" (UID: \"45dd152d1fbe0ae39e2b4acf83e7aee8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.099026 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.099009 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/011bd93cb6528efda482582d85ad698c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-57.ec2.internal\" (UID: \"011bd93cb6528efda482582d85ad698c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.099145 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.099033 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45dd152d1fbe0ae39e2b4acf83e7aee8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal\" (UID: \"45dd152d1fbe0ae39e2b4acf83e7aee8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.099145 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.099044 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/45dd152d1fbe0ae39e2b4acf83e7aee8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal\" (UID: \"45dd152d1fbe0ae39e2b4acf83e7aee8\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.183429 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:31.183361 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:31.224855 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.224818 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.230363 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.230345 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.283892 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:31.283859 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:31.384440 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:31.384397 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:31.485068 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:31.484977 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:31.550893 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.550868 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:12:31.585863 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:31.585836 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-57.ec2.internal\" not found" Apr 20 21:12:31.610304 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.610286 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start 
using new credentials" Apr 20 21:12:31.610678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.610381 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 21:12:31.610678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.610436 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 21:12:31.682360 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.682335 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:12:31.696186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.696167 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 21:12:31.696186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.696172 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.705242 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.705227 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 21:12:31.708432 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.708420 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 21:12:31.709925 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.709913 2567 kubelet.go:3340] "Creating a mirror pod for 
static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" Apr 20 21:12:31.718268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.718252 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 21:12:31.721436 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.721414 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 21:07:30 +0000 UTC" deadline="2027-10-31 02:48:18.645973154 +0000 UTC" Apr 20 21:12:31.721436 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.721435 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13397h35m46.924539862s" Apr 20 21:12:31.726806 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.726788 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rsk49" Apr 20 21:12:31.732586 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.732571 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rsk49" Apr 20 21:12:31.837477 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:31.837446 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45dd152d1fbe0ae39e2b4acf83e7aee8.slice/crio-4cdca7f86c79c62adab12c6aa808a5f888eb3a88116f007822d87c0abb8c4c42 WatchSource:0}: Error finding container 4cdca7f86c79c62adab12c6aa808a5f888eb3a88116f007822d87c0abb8c4c42: Status 404 returned error can't find the container with id 4cdca7f86c79c62adab12c6aa808a5f888eb3a88116f007822d87c0abb8c4c42 Apr 20 21:12:31.838042 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:31.838021 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod011bd93cb6528efda482582d85ad698c.slice/crio-eea810b95771a9d07f032c2dbd6d6f5a27a8669589ea688b1a344fc985cd5774 WatchSource:0}: Error finding container eea810b95771a9d07f032c2dbd6d6f5a27a8669589ea688b1a344fc985cd5774: Status 404 returned error can't find the container with id eea810b95771a9d07f032c2dbd6d6f5a27a8669589ea688b1a344fc985cd5774 Apr 20 21:12:31.841910 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.841896 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:12:31.896321 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:31.896294 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:12:32.677693 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.677657 2567 apiserver.go:52] "Watching apiserver" Apr 20 21:12:32.685281 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.685256 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 21:12:32.685697 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.685671 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wqrxp","openshift-multus/network-metrics-daemon-vc5dw","openshift-network-operator/iptables-alerter-hcq8n","kube-system/global-pull-secret-syncer-blrzp","openshift-cluster-node-tuning-operator/tuned-kp8d9","openshift-dns/node-resolver-qfhjp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal","openshift-multus/multus-fww58","openshift-network-diagnostics/network-check-target-rqns6","openshift-ovn-kubernetes/ovnkube-node-prfjl","kube-system/konnectivity-agent-wl9z5","kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2","openshift-image-registry/node-ca-s6cxs"] Apr 20 21:12:32.688365 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:12:32.688342 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.689419 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.689400 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:32.689522 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.689471 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:32.690583 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.690560 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qfhjp" Apr 20 21:12:32.690762 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.690740 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k9nhc\"" Apr 20 21:12:32.691041 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.690976 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 21:12:32.691277 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.691259 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:12:32.692371 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.692344 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 21:12:32.693726 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:12:32.693703 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s7f2k\"" Apr 20 21:12:32.695776 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.694183 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 21:12:32.696280 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.696261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.696413 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.696395 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:32.696515 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.696493 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:32.697951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.697931 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.699701 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.699554 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 21:12:32.699766 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.699754 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fww58" Apr 20 21:12:32.699879 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.699857 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 21:12:32.700235 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.700212 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 21:12:32.700403 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.700383 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-72zks\"" Apr 20 21:12:32.702002 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.701768 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wnhr7\"" Apr 20 21:12:32.702002 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.701805 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 21:12:32.702002 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.701847 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 21:12:32.702002 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.701768 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 21:12:32.703778 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.703402 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 21:12:32.703778 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.703486 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 21:12:32.703778 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.703584 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jfbzw\"" Apr 20 21:12:32.703778 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.703658 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 21:12:32.704221 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.704208 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:32.704368 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.704349 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:32.706977 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.705487 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wl9z5" Apr 20 21:12:32.707081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.706977 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hcq8n" Apr 20 21:12:32.707491 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707262 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 21:12:32.707571 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707496 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-cnibin\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.707571 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.707571 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-sys\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.707721 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707572 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e22e6563-fb11-436a-86cf-0c8e6a78be42-tmp\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.707721 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707595 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:32.707721 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0b4898da-9e0d-4a11-bec8-8eba5efe7422-hosts-file\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp" Apr 20 21:12:32.707721 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.707721 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-tuned\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.707721 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707692 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7mx\" (UniqueName: 
\"kubernetes.io/projected/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-kube-api-access-7d7mx\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707726 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-kubernetes\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707748 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysctl-conf\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707771 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-systemd\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpwr\" (UniqueName: \"kubernetes.io/projected/0b4898da-9e0d-4a11-bec8-8eba5efe7422-kube-api-access-xhpwr\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:12:32.707819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707856 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysconfig\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707882 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrcl\" (UniqueName: \"kubernetes.io/projected/e22e6563-fb11-436a-86cf-0c8e6a78be42-kube-api-access-ccrcl\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707904 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-device-dir\") pod 
\"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707951 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-etc-selinux\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.708012 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.707973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b4898da-9e0d-4a11-bec8-8eba5efe7422-tmp-dir\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp" Apr 20 21:12:32.708566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.708042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-os-release\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.708566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.708071 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.708566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.708094 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-modprobe-d\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.708566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.708126 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-host\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.708566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.708303 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s6cxs" Apr 20 21:12:32.708566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.708380 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709294 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s9cl2\"" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-socket-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709468 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-run\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709512 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709541 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-system-cni-dir\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dksx\" (UniqueName: \"kubernetes.io/projected/089c1db7-01a7-42ee-bf2b-a07303e05826-kube-api-access-7dksx\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709612 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-var-lib-kubelet\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709635 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-744rh\" (UniqueName: \"kubernetes.io/projected/03abd218-9d5d-4f78-9ff1-919c66c5417e-kube-api-access-744rh\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709662 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-sys-fs\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" 
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709694 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709686 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62995ee3-d913-46d1-a08a-f154a1b3137d-dbus\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709724 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysctl-d\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-lib-modules\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709756 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z6z28\""
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-registration-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.709794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62995ee3-d913-46d1-a08a-f154a1b3137d-kubelet-config\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.710190 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.710398 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.710591 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 21:12:32.710769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.710628 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 21:12:32.711790 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.710759 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 21:12:32.712471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.712086 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 21:12:32.712471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.712230 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 21:12:32.712471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.712300 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7wb2n\""
Apr 20 21:12:32.712471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.712418 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-p7v8b\""
Apr 20 21:12:32.713405 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.713389 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 21:12:32.717566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.713640 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 21:12:32.717566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.713737 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 21:12:32.717566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.713807 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 21:12:32.733705 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.733638 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:07:31 +0000 UTC" deadline="2027-12-05 12:09:55.983293956 +0000 UTC"
Apr 20 21:12:32.733705 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.733664 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14246h57m23.249634049s"
Apr 20 21:12:32.797787 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.797765 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 21:12:32.807813 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.807758 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" event={"ID":"45dd152d1fbe0ae39e2b4acf83e7aee8","Type":"ContainerStarted","Data":"4cdca7f86c79c62adab12c6aa808a5f888eb3a88116f007822d87c0abb8c4c42"}
Apr 20 21:12:32.808862 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.808839 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" event={"ID":"011bd93cb6528efda482582d85ad698c","Type":"ContainerStarted","Data":"eea810b95771a9d07f032c2dbd6d6f5a27a8669589ea688b1a344fc985cd5774"}
Apr 20 21:12:32.810167 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810146 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-node-log\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.810273 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810182 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-env-overrides\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.810273 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810208 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8933c026-e429-44e0-b23f-65580094ed3e-iptables-alerter-script\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:32.810273 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:32.810415 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810285 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7b16dac9-24c6-43b4-a23f-0a9c62fb7317-agent-certs\") pod \"konnectivity-agent-wl9z5\" (UID: \"7b16dac9-24c6-43b4-a23f-0a9c62fb7317\") " pod="kube-system/konnectivity-agent-wl9z5"
Apr 20 21:12:32.810415 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.810353 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:12:32.810505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-system-cni-dir\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.810505 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.810448 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret podName:62995ee3-d913-46d1-a08a-f154a1b3137d nodeName:}" failed. No retries permitted until 2026-04-20 21:12:33.310419086 +0000 UTC m=+3.039387664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret") pod "global-pull-secret-syncer-blrzp" (UID: "62995ee3-d913-46d1-a08a-f154a1b3137d") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:12:32.810505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810461 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-system-cni-dir\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.810505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dksx\" (UniqueName: \"kubernetes.io/projected/089c1db7-01a7-42ee-bf2b-a07303e05826-kube-api-access-7dksx\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.810691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810508 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-744rh\" (UniqueName: \"kubernetes.io/projected/03abd218-9d5d-4f78-9ff1-919c66c5417e-kube-api-access-744rh\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:12:32.810691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62995ee3-d913-46d1-a08a-f154a1b3137d-dbus\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:32.810691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-kubelet\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.810691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-conf-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.810691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810609 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-ovnkube-script-lib\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.810691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810633 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7b16dac9-24c6-43b4-a23f-0a9c62fb7317-konnectivity-ca\") pod \"konnectivity-agent-wl9z5\" (UID: \"7b16dac9-24c6-43b4-a23f-0a9c62fb7317\") " pod="kube-system/konnectivity-agent-wl9z5"
Apr 20 21:12:32.810691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810655 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-netns\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.811014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-cnibin\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.811014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810749 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.811014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810788 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-cnibin\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.811014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62995ee3-d913-46d1-a08a-f154a1b3137d-dbus\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:32.811014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810834 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-sys\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.811014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.810968 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e22e6563-fb11-436a-86cf-0c8e6a78be42-tmp\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-sys\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0b4898da-9e0d-4a11-bec8-8eba5efe7422-hosts-file\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811123 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-os-release\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811152 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-hostroot\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-systemd-units\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811202 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-etc-kubernetes\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811207 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0b4898da-9e0d-4a11-bec8-8eba5efe7422-hosts-file\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh7hf\" (UniqueName: \"kubernetes.io/projected/af220c26-aa9e-4624-b5ca-0581df206506-kube-api-access-mh7hf\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.811283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811278 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-etc-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811311 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811367 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-log-socket\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811392 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-cni-bin\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811408 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-cni-netd\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-systemd\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-systemd\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrcl\" (UniqueName: \"kubernetes.io/projected/e22e6563-fb11-436a-86cf-0c8e6a78be42-kube-api-access-ccrcl\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-device-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811620 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-etc-selinux\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b4898da-9e0d-4a11-bec8-8eba5efe7422-tmp-dir\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp"
Apr 20 21:12:32.811678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811670 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/670b3244-d038-4e79-8acc-575b465321dc-ovn-node-metrics-cert\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-device-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811766 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-etc-selinux\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs888\" (UniqueName: \"kubernetes.io/projected/670b3244-d038-4e79-8acc-575b465321dc-kube-api-access-qs888\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-modprobe-d\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811838 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-socket-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811862 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-cni-multus\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811884 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-slash\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811907 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-var-lib-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811912 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b4898da-9e0d-4a11-bec8-8eba5efe7422-tmp-dir\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811930 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b85hv\" (UniqueName: \"kubernetes.io/projected/8933c026-e429-44e0-b23f-65580094ed3e-kube-api-access-b85hv\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811941 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-socket-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-modprobe-d\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.811959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-run\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-run\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b0c9c36-7f31-4319-bc80-862234ec47e6-serviceca\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812065 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-var-lib-kubelet\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.812227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812091 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-sys-fs\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812134 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-ovn\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812150 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-var-lib-kubelet\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysctl-d\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812140 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-sys-fs\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812218 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-lib-modules\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-registration-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62995ee3-d913-46d1-a08a-f154a1b3137d-kubelet-config\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812281 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysctl-d\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812290 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af220c26-aa9e-4624-b5ca-0581df206506-multus-daemon-config\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812328 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-registration-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812334 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62995ee3-d913-46d1-a08a-f154a1b3137d-kubelet-config\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-lib-modules\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.812398 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af220c26-aa9e-4624-b5ca-0581df206506-cni-binary-copy\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812439 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-cni-bin\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.813076 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.812446 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:12:33.312432032 +0000 UTC m=+3.041400610 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-systemd\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812497 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b0c9c36-7f31-4319-bc80-862234ec47e6-host\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs"
Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812551 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-tuned\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:32.813827 ip-10-0-129-57
kubenswrapper[2567]: I0420 21:12:32.812576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d7mx\" (UniqueName: \"kubernetes.io/projected/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-kube-api-access-7d7mx\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-system-cni-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-socket-dir-parent\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812675 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-kubelet\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812699 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-run-netns\") pod \"ovnkube-node-prfjl\" (UID: 
\"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-kubernetes\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysctl-conf\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpwr\" (UniqueName: \"kubernetes.io/projected/0b4898da-9e0d-4a11-bec8-8eba5efe7422-kube-api-access-xhpwr\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-kubernetes\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.812947 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysctl-conf\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-multus-certs\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.813827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813041 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-ovnkube-config\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813103 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-cni-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813074 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-cnibin\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813151 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-k8s-cni-cncf-io\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " 
pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysconfig\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813265 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-sysconfig\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813295 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813327 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8933c026-e429-44e0-b23f-65580094ed3e-host-slash\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxcnl\" (UniqueName: 
\"kubernetes.io/projected/3b0c9c36-7f31-4319-bc80-862234ec47e6-kube-api-access-pxcnl\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-os-release\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813407 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-host\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-os-release\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.814594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e22e6563-fb11-436a-86cf-0c8e6a78be42-host\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.815381 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813571 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/089c1db7-01a7-42ee-bf2b-a07303e05826-cni-binary-copy\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.815381 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.813583 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/089c1db7-01a7-42ee-bf2b-a07303e05826-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.815632 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.815614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e22e6563-fb11-436a-86cf-0c8e6a78be42-tmp\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.818038 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.817567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e22e6563-fb11-436a-86cf-0c8e6a78be42-etc-tuned\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.819203 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.819151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-744rh\" (UniqueName: \"kubernetes.io/projected/03abd218-9d5d-4f78-9ff1-919c66c5417e-kube-api-access-744rh\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:32.820123 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.820031 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dksx\" (UniqueName: \"kubernetes.io/projected/089c1db7-01a7-42ee-bf2b-a07303e05826-kube-api-access-7dksx\") pod \"multus-additional-cni-plugins-wqrxp\" (UID: \"089c1db7-01a7-42ee-bf2b-a07303e05826\") " pod="openshift-multus/multus-additional-cni-plugins-wqrxp" Apr 20 21:12:32.820632 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.820610 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7mx\" (UniqueName: \"kubernetes.io/projected/f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a-kube-api-access-7d7mx\") pod \"aws-ebs-csi-driver-node-5tnh2\" (UID: \"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" Apr 20 21:12:32.821820 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.821802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrcl\" (UniqueName: \"kubernetes.io/projected/e22e6563-fb11-436a-86cf-0c8e6a78be42-kube-api-access-ccrcl\") pod \"tuned-kp8d9\" (UID: \"e22e6563-fb11-436a-86cf-0c8e6a78be42\") " pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" Apr 20 21:12:32.822491 ip-10-0-129-57 kubenswrapper[2567]: 
I0420 21:12:32.822475 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpwr\" (UniqueName: \"kubernetes.io/projected/0b4898da-9e0d-4a11-bec8-8eba5efe7422-kube-api-access-xhpwr\") pod \"node-resolver-qfhjp\" (UID: \"0b4898da-9e0d-4a11-bec8-8eba5efe7422\") " pod="openshift-dns/node-resolver-qfhjp" Apr 20 21:12:32.914492 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914464 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b85hv\" (UniqueName: \"kubernetes.io/projected/8933c026-e429-44e0-b23f-65580094ed3e-kube-api-access-b85hv\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n" Apr 20 21:12:32.914492 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b0c9c36-7f31-4319-bc80-862234ec47e6-serviceca\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-ovn\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af220c26-aa9e-4624-b5ca-0581df206506-multus-daemon-config\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:12:32.914556 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af220c26-aa9e-4624-b5ca-0581df206506-cni-binary-copy\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914591 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-cni-bin\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-systemd\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914638 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b0c9c36-7f31-4319-bc80-862234ec47e6-host\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914661 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-system-cni-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.914691 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914686 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-socket-dir-parent\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-cni-bin\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914711 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-kubelet\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-run-netns\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914754 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-systemd\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914761 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-multus-certs\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b0c9c36-7f31-4319-bc80-862234ec47e6-host\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914794 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-socket-dir-parent\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914804 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914835 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-kubelet\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914848 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-system-cni-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-ovnkube-config\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914890 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914938 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-cni-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58" Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914977 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-run-netns\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.914893 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-cni-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-cnibin\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915048 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af220c26-aa9e-4624-b5ca-0581df206506-multus-daemon-config\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915052 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-multus-certs\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-k8s-cni-cncf-io\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8933c026-e429-44e0-b23f-65580094ed3e-host-slash\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxcnl\" (UniqueName: \"kubernetes.io/projected/3b0c9c36-7f31-4319-bc80-862234ec47e6-kube-api-access-pxcnl\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915135 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-k8s-cni-cncf-io\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-node-log\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-env-overrides\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915197 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8933c026-e429-44e0-b23f-65580094ed3e-iptables-alerter-script\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7b16dac9-24c6-43b4-a23f-0a9c62fb7317-agent-certs\") pod \"konnectivity-agent-wl9z5\" (UID: \"7b16dac9-24c6-43b4-a23f-0a9c62fb7317\") " pod="kube-system/konnectivity-agent-wl9z5"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915244 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-kubelet\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915260 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-conf-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-ovnkube-script-lib\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915303 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7b16dac9-24c6-43b4-a23f-0a9c62fb7317-konnectivity-ca\") pod \"konnectivity-agent-wl9z5\" (UID: \"7b16dac9-24c6-43b4-a23f-0a9c62fb7317\") " pod="kube-system/konnectivity-agent-wl9z5"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-netns\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915337 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-os-release\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-hostroot\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915367 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-systemd-units\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.915645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915375 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b0c9c36-7f31-4319-bc80-862234ec47e6-serviceca\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915381 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-etc-kubernetes\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh7hf\" (UniqueName: \"kubernetes.io/projected/af220c26-aa9e-4624-b5ca-0581df206506-kube-api-access-mh7hf\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915397 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af220c26-aa9e-4624-b5ca-0581df206506-cni-binary-copy\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-etc-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915425 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-kubelet\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915456 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915453 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-node-log\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8933c026-e429-44e0-b23f-65580094ed3e-host-slash\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915096 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-cnibin\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915695 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-ovnkube-config\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915178 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-ovn\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915428 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-run-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915750 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-log-socket\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915778 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-cni-bin\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915802 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-cni-netd\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-env-overrides\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.916506 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915853 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-multus-conf-dir\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/670b3244-d038-4e79-8acc-575b465321dc-ovn-node-metrics-cert\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs888\" (UniqueName: \"kubernetes.io/projected/670b3244-d038-4e79-8acc-575b465321dc-kube-api-access-qs888\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915905 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/670b3244-d038-4e79-8acc-575b465321dc-ovnkube-script-lib\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-cni-multus\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-slash\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.915982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-slash\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916010 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-cni-bin\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-log-socket\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-var-lib-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916642 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-var-lib-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916670 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-cni-netd\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8933c026-e429-44e0-b23f-65580094ed3e-iptables-alerter-script\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-etc-kubernetes\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.917033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.916733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-etc-openvswitch\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.917046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-systemd-units\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.917113 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7b16dac9-24c6-43b4-a23f-0a9c62fb7317-konnectivity-ca\") pod \"konnectivity-agent-wl9z5\" (UID: \"7b16dac9-24c6-43b4-a23f-0a9c62fb7317\") " pod="kube-system/konnectivity-agent-wl9z5"
Apr 20 21:12:32.917872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.917166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-os-release\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.917872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.917166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-run-netns\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.917872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.917183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-hostroot\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.917872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.917223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/670b3244-d038-4e79-8acc-575b465321dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.917872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.917273 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af220c26-aa9e-4624-b5ca-0581df206506-host-var-lib-cni-multus\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.919138 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.919116 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/670b3244-d038-4e79-8acc-575b465321dc-ovn-node-metrics-cert\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:32.920276 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.920259 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:12:32.920276 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.920266 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7b16dac9-24c6-43b4-a23f-0a9c62fb7317-agent-certs\") pod \"konnectivity-agent-wl9z5\" (UID: \"7b16dac9-24c6-43b4-a23f-0a9c62fb7317\") " pod="kube-system/konnectivity-agent-wl9z5"
Apr 20 21:12:32.920366 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.920285 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:12:32.920366 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.920296 2567 projected.go:194] Error preparing data for projected volume kube-api-access-76fbk for pod openshift-network-diagnostics/network-check-target-rqns6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:12:32.920366 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:32.920350 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk podName:6fde3cd9-8c1d-4801-8eeb-c3bfd3815846 nodeName:}" failed. No retries permitted until 2026-04-20 21:12:33.420337735 +0000 UTC m=+3.149306328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-76fbk" (UniqueName: "kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk") pod "network-check-target-rqns6" (UID: "6fde3cd9-8c1d-4801-8eeb-c3bfd3815846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:12:32.923734 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.923365 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b85hv\" (UniqueName: \"kubernetes.io/projected/8933c026-e429-44e0-b23f-65580094ed3e-kube-api-access-b85hv\") pod \"iptables-alerter-hcq8n\" (UID: \"8933c026-e429-44e0-b23f-65580094ed3e\") " pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:32.923734 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.923680 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxcnl\" (UniqueName: \"kubernetes.io/projected/3b0c9c36-7f31-4319-bc80-862234ec47e6-kube-api-access-pxcnl\") pod \"node-ca-s6cxs\" (UID: \"3b0c9c36-7f31-4319-bc80-862234ec47e6\") " pod="openshift-image-registry/node-ca-s6cxs"
Apr 20 21:12:32.924606 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.924530 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh7hf\" (UniqueName: \"kubernetes.io/projected/af220c26-aa9e-4624-b5ca-0581df206506-kube-api-access-mh7hf\") pod \"multus-fww58\" (UID: \"af220c26-aa9e-4624-b5ca-0581df206506\") " pod="openshift-multus/multus-fww58"
Apr 20 21:12:32.925137 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:32.925115 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs888\" (UniqueName: \"kubernetes.io/projected/670b3244-d038-4e79-8acc-575b465321dc-kube-api-access-qs888\") pod \"ovnkube-node-prfjl\" (UID: \"670b3244-d038-4e79-8acc-575b465321dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:33.007866 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.007835 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kp8d9"
Apr 20 21:12:33.017934 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.017913 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qfhjp"
Apr 20 21:12:33.028644 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.028502 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2"
Apr 20 21:12:33.035103 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.035083 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wqrxp"
Apr 20 21:12:33.041856 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.041840 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fww58"
Apr 20 21:12:33.050375 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.050359 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wl9z5"
Apr 20 21:12:33.058863 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.058846 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hcq8n"
Apr 20 21:12:33.064299 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.064281 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s6cxs"
Apr 20 21:12:33.070829 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.070781 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:12:33.174885 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.174861 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:12:33.319257 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.319226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:12:33.319394 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.319277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:12:33.319394 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.319385 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:12:33.319468 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.319448 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret podName:62995ee3-d913-46d1-a08a-f154a1b3137d nodeName:}" failed. No retries permitted until 2026-04-20 21:12:34.319433527 +0000 UTC m=+4.048402104 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret") pod "global-pull-secret-syncer-blrzp" (UID: "62995ee3-d913-46d1-a08a-f154a1b3137d") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:12:33.319468 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.319385 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:12:33.319549 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.319496 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:12:34.319486472 +0000 UTC m=+4.048455049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:12:33.520306 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.520243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6"
Apr 20 21:12:33.520397 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.520381 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:12:33.520439 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.520400 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:12:33.520439 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.520409 2567 projected.go:194] Error preparing data for projected volume kube-api-access-76fbk for pod openshift-network-diagnostics/network-check-target-rqns6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:12:33.520525 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:33.520455 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk podName:6fde3cd9-8c1d-4801-8eeb-c3bfd3815846 nodeName:}" failed. No retries permitted until 2026-04-20 21:12:34.520439439 +0000 UTC m=+4.249408020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-76fbk" (UniqueName: "kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk") pod "network-check-target-rqns6" (UID: "6fde3cd9-8c1d-4801-8eeb-c3bfd3815846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:12:33.520866 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.520830 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode22e6563_fb11_436a_86cf_0c8e6a78be42.slice/crio-4611f9d67e82fd71dc9863649ef3bfb6c07afe3702db91922d74517cfa3b51b5 WatchSource:0}: Error finding container 4611f9d67e82fd71dc9863649ef3bfb6c07afe3702db91922d74517cfa3b51b5: Status 404 returned error can't find the container with id 4611f9d67e82fd71dc9863649ef3bfb6c07afe3702db91922d74517cfa3b51b5
Apr 20 21:12:33.522459 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.522435 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b0c9c36_7f31_4319_bc80_862234ec47e6.slice/crio-28f7b2d787106e39903197c2554c3d8c10cdd29278d760c514fd44acc6d75802 WatchSource:0}: Error finding container 28f7b2d787106e39903197c2554c3d8c10cdd29278d760c514fd44acc6d75802: Status 404 returned error can't find the container with id 28f7b2d787106e39903197c2554c3d8c10cdd29278d760c514fd44acc6d75802
Apr 20 21:12:33.523231 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.523205 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod089c1db7_01a7_42ee_bf2b_a07303e05826.slice/crio-5d1f23ca1fa9c3a8d1db70fc3eafb04afa70deee749103ff6be47ec765201aa1 WatchSource:0}: Error finding container 5d1f23ca1fa9c3a8d1db70fc3eafb04afa70deee749103ff6be47ec765201aa1: Status 404 returned error can't find the container with id 5d1f23ca1fa9c3a8d1db70fc3eafb04afa70deee749103ff6be47ec765201aa1
Apr 20 21:12:33.524337 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.524307 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b16dac9_24c6_43b4_a23f_0a9c62fb7317.slice/crio-211909001f3fa7e287039a3380d6995ced79b95b840ab3e730e112ddf0469969 WatchSource:0}: Error finding container 211909001f3fa7e287039a3380d6995ced79b95b840ab3e730e112ddf0469969: Status 404 returned error can't find the container with id 211909001f3fa7e287039a3380d6995ced79b95b840ab3e730e112ddf0469969
Apr 20 21:12:33.526331 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.526286 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b4898da_9e0d_4a11_bec8_8eba5efe7422.slice/crio-7db8b4476ffc65a43f4ca9c8dea5adf9cdbc6ac1f4aa339443e45bdea3032d93 WatchSource:0}: Error finding container 7db8b4476ffc65a43f4ca9c8dea5adf9cdbc6ac1f4aa339443e45bdea3032d93: Status 404 returned error can't find the container with id 7db8b4476ffc65a43f4ca9c8dea5adf9cdbc6ac1f4aa339443e45bdea3032d93
Apr 20 21:12:33.528904 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.528885 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4128a58_3d9c_45a6_b9e4_53ce4ddf7e8a.slice/crio-0cc602295f0c4dea5b73611d66c8bde173b12934fe0dc5b47734eea5614f804b WatchSource:0}: Error finding container 0cc602295f0c4dea5b73611d66c8bde173b12934fe0dc5b47734eea5614f804b: Status 404 returned error can't find the container with id 0cc602295f0c4dea5b73611d66c8bde173b12934fe0dc5b47734eea5614f804b
Apr 20 21:12:33.529803 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.529783 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf220c26_aa9e_4624_b5ca_0581df206506.slice/crio-2592b172e5a11d3276704a32cac63ee6cddb4702c6a87bbbc646fa01654fbe85 WatchSource:0}: Error finding container 2592b172e5a11d3276704a32cac63ee6cddb4702c6a87bbbc646fa01654fbe85: Status 404 returned error can't find the container with id 2592b172e5a11d3276704a32cac63ee6cddb4702c6a87bbbc646fa01654fbe85
Apr 20 21:12:33.530822 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.530804 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8933c026_e429_44e0_b23f_65580094ed3e.slice/crio-9206c2eed8225c4a5531afe8ced84ac49b1581565218cb3aa4f1e22053c6c665 WatchSource:0}: Error finding container 9206c2eed8225c4a5531afe8ced84ac49b1581565218cb3aa4f1e22053c6c665: Status 404 returned error can't find the container with id 9206c2eed8225c4a5531afe8ced84ac49b1581565218cb3aa4f1e22053c6c665
Apr 20 21:12:33.533463 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:12:33.533439 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod670b3244_d038_4e79_8acc_575b465321dc.slice/crio-54b755f8aac67316e08be9e37128b5e35d88b19006a5316f8b308c3ae407b5f5 WatchSource:0}: Error finding container 54b755f8aac67316e08be9e37128b5e35d88b19006a5316f8b308c3ae407b5f5: Status 404 returned error can't find the container with id 54b755f8aac67316e08be9e37128b5e35d88b19006a5316f8b308c3ae407b5f5
Apr 20 21:12:33.734837 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.734686 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:07:31 +0000 UTC" deadline="2027-12-12 07:19:39.292835393 +0000 UTC"
Apr 20 21:12:33.734837 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.734834 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14410h7m5.558006145s"
Apr 20 21:12:33.812278 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.812189 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hcq8n" event={"ID":"8933c026-e429-44e0-b23f-65580094ed3e","Type":"ContainerStarted","Data":"9206c2eed8225c4a5531afe8ced84ac49b1581565218cb3aa4f1e22053c6c665"}
Apr 20 21:12:33.813057 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.813035 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qfhjp" event={"ID":"0b4898da-9e0d-4a11-bec8-8eba5efe7422","Type":"ContainerStarted","Data":"7db8b4476ffc65a43f4ca9c8dea5adf9cdbc6ac1f4aa339443e45bdea3032d93"}
Apr 20 21:12:33.814976 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.814955 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" event={"ID":"011bd93cb6528efda482582d85ad698c","Type":"ContainerStarted","Data":"a7c3c67de6c06f09bd6760717f9afd6a2ccf9446b25afd9168582c0efdadcb1d"}
Apr 20 21:12:33.816017 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:12:33.815975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fww58" event={"ID":"af220c26-aa9e-4624-b5ca-0581df206506","Type":"ContainerStarted","Data":"2592b172e5a11d3276704a32cac63ee6cddb4702c6a87bbbc646fa01654fbe85"} Apr 20 21:12:33.816784 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.816767 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" event={"ID":"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a","Type":"ContainerStarted","Data":"0cc602295f0c4dea5b73611d66c8bde173b12934fe0dc5b47734eea5614f804b"} Apr 20 21:12:33.818524 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.818502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wl9z5" event={"ID":"7b16dac9-24c6-43b4-a23f-0a9c62fb7317","Type":"ContainerStarted","Data":"211909001f3fa7e287039a3380d6995ced79b95b840ab3e730e112ddf0469969"} Apr 20 21:12:33.822562 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.822530 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerStarted","Data":"5d1f23ca1fa9c3a8d1db70fc3eafb04afa70deee749103ff6be47ec765201aa1"} Apr 20 21:12:33.823482 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.823453 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s6cxs" event={"ID":"3b0c9c36-7f31-4319-bc80-862234ec47e6","Type":"ContainerStarted","Data":"28f7b2d787106e39903197c2554c3d8c10cdd29278d760c514fd44acc6d75802"} Apr 20 21:12:33.824408 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.824373 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" event={"ID":"e22e6563-fb11-436a-86cf-0c8e6a78be42","Type":"ContainerStarted","Data":"4611f9d67e82fd71dc9863649ef3bfb6c07afe3702db91922d74517cfa3b51b5"} Apr 
20 21:12:33.825278 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.825261 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"54b755f8aac67316e08be9e37128b5e35d88b19006a5316f8b308c3ae407b5f5"} Apr 20 21:12:33.827052 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:33.827008 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-57.ec2.internal" podStartSLOduration=2.8269763599999997 podStartE2EDuration="2.82697636s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:12:33.826759143 +0000 UTC m=+3.555727742" watchObservedRunningTime="2026-04-20 21:12:33.82697636 +0000 UTC m=+3.555944960" Apr 20 21:12:34.326576 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.326538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:34.326754 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.326608 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:34.326754 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.326727 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 20 21:12:34.326857 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.326786 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:12:36.326769495 +0000 UTC m=+6.055738081 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:12:34.327235 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.327194 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:34.327342 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.327246 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret podName:62995ee3-d913-46d1-a08a-f154a1b3137d nodeName:}" failed. No retries permitted until 2026-04-20 21:12:36.327230602 +0000 UTC m=+6.056199187 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret") pod "global-pull-secret-syncer-blrzp" (UID: "62995ee3-d913-46d1-a08a-f154a1b3137d") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:34.527615 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.527584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:34.527750 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.527731 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:12:34.527805 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.527758 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:12:34.527805 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.527771 2567 projected.go:194] Error preparing data for projected volume kube-api-access-76fbk for pod openshift-network-diagnostics/network-check-target-rqns6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:34.527885 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.527826 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk podName:6fde3cd9-8c1d-4801-8eeb-c3bfd3815846 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:12:36.527808618 +0000 UTC m=+6.256777201 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-76fbk" (UniqueName: "kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk") pod "network-check-target-rqns6" (UID: "6fde3cd9-8c1d-4801-8eeb-c3bfd3815846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:34.726648 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.726557 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:12:34.804569 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.804124 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:34.804569 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.804148 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:34.804569 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.804240 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:34.804569 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.804271 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:34.804569 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.804301 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:34.804569 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:34.804357 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:34.841134 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.841066 2567 generic.go:358] "Generic (PLEG): container finished" podID="45dd152d1fbe0ae39e2b4acf83e7aee8" containerID="b4941930c6cdea1e360b1ea9f7e78ff167694958d35d51e8c583190adb02ee5b" exitCode=0 Apr 20 21:12:34.841588 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:34.841563 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" event={"ID":"45dd152d1fbe0ae39e2b4acf83e7aee8","Type":"ContainerDied","Data":"b4941930c6cdea1e360b1ea9f7e78ff167694958d35d51e8c583190adb02ee5b"} Apr 20 21:12:35.855922 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:35.855254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" event={"ID":"45dd152d1fbe0ae39e2b4acf83e7aee8","Type":"ContainerStarted","Data":"6d61756c5889f554dad11f9722373debf5ae4d80008334896e5916140c91de16"} Apr 20 21:12:35.868384 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:35.867773 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-57.ec2.internal" podStartSLOduration=4.867755321 podStartE2EDuration="4.867755321s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:12:35.866907685 +0000 UTC m=+5.595876286" watchObservedRunningTime="2026-04-20 21:12:35.867755321 +0000 UTC m=+5.596723920" Apr 20 21:12:36.343018 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:36.342970 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:36.343193 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:36.343057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:36.343193 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.343174 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:36.343319 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.343236 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret podName:62995ee3-d913-46d1-a08a-f154a1b3137d nodeName:}" failed. No retries permitted until 2026-04-20 21:12:40.343218316 +0000 UTC m=+10.072186910 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret") pod "global-pull-secret-syncer-blrzp" (UID: "62995ee3-d913-46d1-a08a-f154a1b3137d") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:36.343646 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.343628 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:12:36.343760 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.343679 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:12:40.343664785 +0000 UTC m=+10.072633368 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:12:36.544854 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:36.544761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:36.545038 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.544966 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:12:36.545038 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.545003 2567 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:12:36.545038 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.545016 2567 projected.go:194] Error preparing data for projected volume kube-api-access-76fbk for pod openshift-network-diagnostics/network-check-target-rqns6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:36.545219 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.545072 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk podName:6fde3cd9-8c1d-4801-8eeb-c3bfd3815846 nodeName:}" failed. No retries permitted until 2026-04-20 21:12:40.545054227 +0000 UTC m=+10.274022811 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-76fbk" (UniqueName: "kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk") pod "network-check-target-rqns6" (UID: "6fde3cd9-8c1d-4801-8eeb-c3bfd3815846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:36.807857 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:36.807261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:36.807857 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:36.807324 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:36.807857 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:36.807356 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:36.807857 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.807463 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:36.807857 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.807532 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:36.807857 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:36.807591 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:38.805885 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:38.805855 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:38.806353 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:38.806007 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:38.806418 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:38.806404 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:38.806663 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:38.806482 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:38.806663 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:38.806557 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:38.806663 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:38.806620 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:40.375594 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:40.375549 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:40.376036 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:40.375636 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:40.376036 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.375744 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:40.376036 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.375749 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:12:40.376036 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.375806 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret podName:62995ee3-d913-46d1-a08a-f154a1b3137d nodeName:}" failed. No retries permitted until 2026-04-20 21:12:48.375789541 +0000 UTC m=+18.104758120 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret") pod "global-pull-secret-syncer-blrzp" (UID: "62995ee3-d913-46d1-a08a-f154a1b3137d") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:40.376036 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.375820 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:12:48.375813614 +0000 UTC m=+18.104782190 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:12:40.578272 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:40.578233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:40.578443 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.578406 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:12:40.578443 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.578426 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:12:40.578443 ip-10-0-129-57 kubenswrapper[2567]: 
E0420 21:12:40.578438 2567 projected.go:194] Error preparing data for projected volume kube-api-access-76fbk for pod openshift-network-diagnostics/network-check-target-rqns6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:40.578587 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.578492 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk podName:6fde3cd9-8c1d-4801-8eeb-c3bfd3815846 nodeName:}" failed. No retries permitted until 2026-04-20 21:12:48.578474378 +0000 UTC m=+18.307442969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-76fbk" (UniqueName: "kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk") pod "network-check-target-rqns6" (UID: "6fde3cd9-8c1d-4801-8eeb-c3bfd3815846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:40.806011 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:40.804118 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:40.806011 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.804250 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:40.806011 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:40.804616 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:40.806011 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.804707 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:40.806011 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:40.805829 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:40.806011 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:40.805906 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:42.803240 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:42.803194 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:42.803240 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:42.803218 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:42.803742 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:42.803319 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:42.803742 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:42.803359 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:42.803742 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:42.803474 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:42.803742 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:42.803507 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:44.803025 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:44.802980 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:44.803498 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:44.802981 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:44.803498 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:44.803095 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:44.803498 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:44.803207 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:44.803498 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:44.803001 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:44.803498 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:44.803302 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:46.803549 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:46.803510 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:46.804032 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:46.803621 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:46.804032 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:46.803631 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:46.804032 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:46.803720 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:46.804032 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:46.803764 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:46.804032 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:46.803833 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:48.436440 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:48.436403 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:48.436892 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:48.436478 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:48.436892 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.436558 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:12:48.436892 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.436558 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:48.436892 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.436622 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:13:04.436602952 +0000 UTC m=+34.165571545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:12:48.436892 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.436642 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret podName:62995ee3-d913-46d1-a08a-f154a1b3137d nodeName:}" failed. No retries permitted until 2026-04-20 21:13:04.436632243 +0000 UTC m=+34.165600822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret") pod "global-pull-secret-syncer-blrzp" (UID: "62995ee3-d913-46d1-a08a-f154a1b3137d") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:12:48.637730 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:48.637691 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:48.639222 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.639194 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:12:48.639222 ip-10-0-129-57 
kubenswrapper[2567]: E0420 21:12:48.639227 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:12:48.639436 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.639242 2567 projected.go:194] Error preparing data for projected volume kube-api-access-76fbk for pod openshift-network-diagnostics/network-check-target-rqns6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:48.639436 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.639322 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk podName:6fde3cd9-8c1d-4801-8eeb-c3bfd3815846 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:04.639303044 +0000 UTC m=+34.368271635 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-76fbk" (UniqueName: "kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk") pod "network-check-target-rqns6" (UID: "6fde3cd9-8c1d-4801-8eeb-c3bfd3815846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:12:48.803611 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:48.803539 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:48.803611 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:48.803583 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:48.803611 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:48.803610 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:48.803857 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.803718 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:48.803907 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.803861 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:48.803980 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:48.803961 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:50.804217 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:50.804175 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:50.804636 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:50.804284 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:50.804679 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:50.804660 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:50.804772 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:50.804749 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:50.804832 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:50.804807 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:50.804901 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:50.804870 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:51.884552 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.884102 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"295ef47dd7d28a09efbab6fdc637c1e731b5c0d54b01ae6c015c68ef792e0e92"} Apr 20 21:12:51.884552 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.884497 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"5f834bd1918490c326ea1d7d498e04286a70f2c6b44fcb36d12f8dae054a2430"} Apr 20 21:12:51.884552 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.884519 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"d6959887e5da1ec0d95a01f7fbefc8fffaf00aa874461308765c1baf46efc5b0"} Apr 20 21:12:51.884552 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.884531 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"556ea189f51f0a9ab1ef03413f5ed98ac8e5240630b1c45c6eed35ad20d8179e"} Apr 20 21:12:51.885433 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.885399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qfhjp" event={"ID":"0b4898da-9e0d-4a11-bec8-8eba5efe7422","Type":"ContainerStarted","Data":"85e86468741659e182b97dec8b97752b41b801f6b717b81ce8b6a2ca5fa8828d"} Apr 20 21:12:51.886776 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.886739 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fww58" 
event={"ID":"af220c26-aa9e-4624-b5ca-0581df206506","Type":"ContainerStarted","Data":"384c928c55b43d6c6161e6a5c3ff81877d1adda40c0f1aa35bc047544608a050"} Apr 20 21:12:51.888127 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.888108 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" event={"ID":"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a","Type":"ContainerStarted","Data":"efb3dc1d3e3e74d09bc0487f30fe5abf829fccf2849b0d154f7e03bf43f55229"} Apr 20 21:12:51.889504 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.889476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wl9z5" event={"ID":"7b16dac9-24c6-43b4-a23f-0a9c62fb7317","Type":"ContainerStarted","Data":"334b4348a71d8bb82e594a3bc05ceaa1b1abc264df7b103af6d4211c592a2710"} Apr 20 21:12:51.891010 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.890956 2567 generic.go:358] "Generic (PLEG): container finished" podID="089c1db7-01a7-42ee-bf2b-a07303e05826" containerID="e783bd3d28174c830dab4361618a927996d26e6ca386f8d48fca2fefd4acf171" exitCode=0 Apr 20 21:12:51.891126 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.891029 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerDied","Data":"e783bd3d28174c830dab4361618a927996d26e6ca386f8d48fca2fefd4acf171"} Apr 20 21:12:51.894826 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.894784 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s6cxs" event={"ID":"3b0c9c36-7f31-4319-bc80-862234ec47e6","Type":"ContainerStarted","Data":"180b406a77d2db1b541651e4c9a0f330d1c22873f41c384451bce3a4a4c9b4d2"} Apr 20 21:12:51.896431 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.896412 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" 
event={"ID":"e22e6563-fb11-436a-86cf-0c8e6a78be42","Type":"ContainerStarted","Data":"d0e8f1cfbb9cbf0f76793cd2031f4bccedd0d39696ba60cf141d0005a3d27078"} Apr 20 21:12:51.911378 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.911332 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qfhjp" podStartSLOduration=4.269499719 podStartE2EDuration="21.9113169s" podCreationTimestamp="2026-04-20 21:12:30 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.529037713 +0000 UTC m=+3.258006296" lastFinishedPulling="2026-04-20 21:12:51.170854896 +0000 UTC m=+20.899823477" observedRunningTime="2026-04-20 21:12:51.910935776 +0000 UTC m=+21.639904377" watchObservedRunningTime="2026-04-20 21:12:51.9113169 +0000 UTC m=+21.640285500" Apr 20 21:12:51.924846 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.924801 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wl9z5" podStartSLOduration=3.258108515 podStartE2EDuration="20.924784707s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.526866947 +0000 UTC m=+3.255835530" lastFinishedPulling="2026-04-20 21:12:51.193543142 +0000 UTC m=+20.922511722" observedRunningTime="2026-04-20 21:12:51.924670561 +0000 UTC m=+21.653639161" watchObservedRunningTime="2026-04-20 21:12:51.924784707 +0000 UTC m=+21.653753306" Apr 20 21:12:51.943035 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.942980 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fww58" podStartSLOduration=3.0817517 podStartE2EDuration="20.942968636s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.533703787 +0000 UTC m=+3.262672365" lastFinishedPulling="2026-04-20 21:12:51.394920719 +0000 UTC m=+21.123889301" observedRunningTime="2026-04-20 21:12:51.942760248 +0000 UTC m=+21.671728847" watchObservedRunningTime="2026-04-20 
21:12:51.942968636 +0000 UTC m=+21.671937235" Apr 20 21:12:51.956501 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.956462 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-s6cxs" podStartSLOduration=6.945992785 podStartE2EDuration="20.956451212s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.525137109 +0000 UTC m=+3.254105693" lastFinishedPulling="2026-04-20 21:12:47.535595542 +0000 UTC m=+17.264564120" observedRunningTime="2026-04-20 21:12:51.9558977 +0000 UTC m=+21.684866301" watchObservedRunningTime="2026-04-20 21:12:51.956451212 +0000 UTC m=+21.685419811" Apr 20 21:12:51.972425 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:51.972378 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kp8d9" podStartSLOduration=4.139352221 podStartE2EDuration="21.972362727s" podCreationTimestamp="2026-04-20 21:12:30 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.522962916 +0000 UTC m=+3.251931494" lastFinishedPulling="2026-04-20 21:12:51.355973227 +0000 UTC m=+21.084942000" observedRunningTime="2026-04-20 21:12:51.972162716 +0000 UTC m=+21.701131316" watchObservedRunningTime="2026-04-20 21:12:51.972362727 +0000 UTC m=+21.701331330" Apr 20 21:12:52.590878 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.590770 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 21:12:52.766574 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.766378 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T21:12:52.590790327Z","UUID":"f5a151fa-fd79-4e0d-8acb-f030e3a8d60a","Handler":null,"Name":"","Endpoint":""} Apr 20 21:12:52.768171 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:12:52.768150 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 21:12:52.768310 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.768177 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 21:12:52.803938 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.803383 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:52.803938 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:52.803503 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:52.803938 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.803811 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:52.803938 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:52.803900 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:52.804160 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.804113 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:52.804261 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:52.804241 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:52.900180 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.900150 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" event={"ID":"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a","Type":"ContainerStarted","Data":"0981d68ac5a3b9285457d01fd6be6c1bbf2d37789c14f73232f7cbe860b692ba"} Apr 20 21:12:52.903041 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.903011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"1c48b1554ee4da6daecf07271d0b4e9ecfaff3ebf692e7bd4573c137a2ad6054"} Apr 20 21:12:52.903152 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.903048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"4dd92ef9e03bb60361c8ebca6ca759075c237b312f5bcfc864eac1a4facc873a"} Apr 20 21:12:52.904483 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:52.904235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hcq8n" event={"ID":"8933c026-e429-44e0-b23f-65580094ed3e","Type":"ContainerStarted","Data":"7a0a4710b3b8fbb8f577d2dd0fb1a74e6395251c175de378da220c8266777710"} Apr 20 21:12:52.917463 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:12:52.917422 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hcq8n" podStartSLOduration=4.098694684 podStartE2EDuration="21.917408811s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.535050759 +0000 UTC m=+3.264019346" lastFinishedPulling="2026-04-20 21:12:51.353764882 +0000 UTC m=+21.082733473" observedRunningTime="2026-04-20 21:12:52.917386272 +0000 UTC m=+22.646354872" watchObservedRunningTime="2026-04-20 21:12:52.917408811 +0000 UTC m=+22.646377783" Apr 20 21:12:53.907827 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:53.907792 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" event={"ID":"f4128a58-3d9c-45a6-b9e4-53ce4ddf7e8a","Type":"ContainerStarted","Data":"715ec2378268eedc4418650cff502bd1895a72a29c3cb2b33b48f7efc48dc2cc"} Apr 20 21:12:53.923457 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:53.923417 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5tnh2" podStartSLOduration=4.050808655 podStartE2EDuration="23.923402769s" podCreationTimestamp="2026-04-20 21:12:30 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.530442295 +0000 UTC m=+3.259410878" lastFinishedPulling="2026-04-20 21:12:53.403036398 +0000 UTC m=+23.132004992" observedRunningTime="2026-04-20 21:12:53.923029728 +0000 UTC m=+23.651998328" watchObservedRunningTime="2026-04-20 21:12:53.923402769 +0000 UTC m=+23.652371368" Apr 20 21:12:54.803044 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:54.803009 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:54.803229 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:54.803009 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:54.803229 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:54.803143 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:54.803229 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:54.803196 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:54.803405 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:54.803303 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:54.803461 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:54.803407 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:54.914624 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:54.914585 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"1ae00cb7dc98221c1ce72c28ebbd43c50339a22bda78d5df54af977f43a71d92"} Apr 20 21:12:56.371427 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.371110 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wl9z5" Apr 20 21:12:56.372606 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.371706 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wl9z5" Apr 20 21:12:56.802910 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.802833 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:56.802910 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.802850 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:56.803175 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:56.802938 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:56.803175 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.803019 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:56.803175 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:56.803026 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:56.803175 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:56.803082 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:56.920498 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.920468 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" event={"ID":"670b3244-d038-4e79-8acc-575b465321dc","Type":"ContainerStarted","Data":"2b07896cc3e94d8efc0c8dd5520f9019b535cf0a5c61725da9262cba32a6ce9c"} Apr 20 21:12:56.921258 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.920796 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:56.921258 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.920825 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:56.922092 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.922071 2567 generic.go:358] "Generic (PLEG): container finished" podID="089c1db7-01a7-42ee-bf2b-a07303e05826" 
containerID="f7a5d51064e6ba7bcbe9be917439936bfa9e2a7cf88d5103b85eb42033f401b5" exitCode=0 Apr 20 21:12:56.922179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.922131 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerDied","Data":"f7a5d51064e6ba7bcbe9be917439936bfa9e2a7cf88d5103b85eb42033f401b5"} Apr 20 21:12:56.922415 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.922387 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wl9z5" Apr 20 21:12:56.922873 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.922813 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wl9z5" Apr 20 21:12:56.935215 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.935198 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:56.950488 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:56.950445 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" podStartSLOduration=7.93574351 podStartE2EDuration="25.950433993s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.536471916 +0000 UTC m=+3.265440493" lastFinishedPulling="2026-04-20 21:12:51.551162397 +0000 UTC m=+21.280130976" observedRunningTime="2026-04-20 21:12:56.950122413 +0000 UTC m=+26.679091012" watchObservedRunningTime="2026-04-20 21:12:56.950433993 +0000 UTC m=+26.679402591" Apr 20 21:12:57.926667 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:57.926388 2567 generic.go:358] "Generic (PLEG): container finished" podID="089c1db7-01a7-42ee-bf2b-a07303e05826" containerID="03560f81e551dccf273702c38f8df2be57dbce61f0e0786d95a445552d7dfe8b" exitCode=0 Apr 20 21:12:57.927296 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:12:57.926465 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerDied","Data":"03560f81e551dccf273702c38f8df2be57dbce61f0e0786d95a445552d7dfe8b"} Apr 20 21:12:57.927505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:57.927439 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:57.952505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:57.952466 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl" Apr 20 21:12:58.202639 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.202564 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rqns6"] Apr 20 21:12:58.202769 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.202693 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:58.202836 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:58.202816 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:58.205928 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.205905 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vc5dw"] Apr 20 21:12:58.206056 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.206047 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:58.206161 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:58.206145 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:12:58.206707 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.206688 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-blrzp"] Apr 20 21:12:58.206796 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.206777 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:58.206915 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:58.206883 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:58.930151 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.930065 2567 generic.go:358] "Generic (PLEG): container finished" podID="089c1db7-01a7-42ee-bf2b-a07303e05826" containerID="4fc45e60419d9cd6dfd12bf7c766f8f22fc89ee087a4bdefc9ce5c323be298a2" exitCode=0 Apr 20 21:12:58.930487 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:58.930149 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerDied","Data":"4fc45e60419d9cd6dfd12bf7c766f8f22fc89ee087a4bdefc9ce5c323be298a2"} Apr 20 21:12:59.803673 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:59.803626 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:12:59.803673 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:59.803651 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:12:59.803940 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:59.803744 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:12:59.803940 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:59.803840 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:12:59.803940 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:12:59.803880 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:12:59.804062 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:12:59.803979 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:13:01.803015 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:01.802970 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:13:01.803505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:01.803030 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:13:01.803505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:01.802970 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:13:01.803505 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:01.803102 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:13:01.803505 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:01.803184 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:13:01.803505 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:01.803275 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:13:03.802758 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:03.802727 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:13:03.803322 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:03.802776 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:13:03.803322 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:03.802829 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:13:03.803322 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:03.802945 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-blrzp" podUID="62995ee3-d913-46d1-a08a-f154a1b3137d" Apr 20 21:13:03.803322 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:03.803044 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vc5dw" podUID="03abd218-9d5d-4f78-9ff1-919c66c5417e" Apr 20 21:13:03.803322 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:03.803117 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rqns6" podUID="6fde3cd9-8c1d-4801-8eeb-c3bfd3815846" Apr 20 21:13:04.459289 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.459200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:13:04.459289 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.459271 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw" Apr 20 21:13:04.459496 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.459375 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:04.459496 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.459376 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:04.459496 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.459439 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.459421712 +0000 UTC m=+66.188390290 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:04.459496 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.459453 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret podName:62995ee3-d913-46d1-a08a-f154a1b3137d nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.459447074 +0000 UTC m=+66.188415651 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret") pod "global-pull-secret-syncer-blrzp" (UID: "62995ee3-d913-46d1-a08a-f154a1b3137d") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:04.591589 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.591553 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-57.ec2.internal" event="NodeReady" Apr 20 21:13:04.591727 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.591668 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 21:13:04.625443 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.625421 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c7f65c64-sjpd5"] Apr 20 21:13:04.639545 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.639528 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.639850 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.639825 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cnvwm"] Apr 20 21:13:04.644758 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.644734 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4cm2d\"" Apr 20 21:13:04.644966 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.644938 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 21:13:04.645316 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.645296 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 21:13:04.648203 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.648182 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 21:13:04.655887 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.655867 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 21:13:04.661000 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.660964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:13:04.661183 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.661146 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:04.661183 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.661173 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:04.661183 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.661186 2567 projected.go:194] Error preparing data for projected volume kube-api-access-76fbk for pod openshift-network-diagnostics/network-check-target-rqns6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:04.661393 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.661265 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk podName:6fde3cd9-8c1d-4801-8eeb-c3bfd3815846 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.661224605 +0000 UTC m=+66.390193197 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-76fbk" (UniqueName: "kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk") pod "network-check-target-rqns6" (UID: "6fde3cd9-8c1d-4801-8eeb-c3bfd3815846") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:04.664677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.664658 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c7f65c64-sjpd5"] Apr 20 21:13:04.664750 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.664686 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mm5c7"] Apr 20 21:13:04.664786 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.664756 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cnvwm" Apr 20 21:13:04.667124 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.667098 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 21:13:04.667225 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.667145 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtslr\"" Apr 20 21:13:04.667623 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.667606 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 21:13:04.677110 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.677091 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mm5c7"] Apr 20 21:13:04.677110 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.677113 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cnvwm"] Apr 20 21:13:04.677222 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:13:04.677189 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:13:04.679816 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.679800 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 21:13:04.680045 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.680028 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 21:13:04.680114 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.680061 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 21:13:04.680114 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.680091 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kkzr9\"" Apr 20 21:13:04.762081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-config-volume\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm" Apr 20 21:13:04.762081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-tmp-dir\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm" Apr 20 21:13:04.762081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762080 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pf9p2\" (UniqueName: \"kubernetes.io/projected/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-kube-api-access-pf9p2\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm" Apr 20 21:13:04.762251 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762105 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-image-registry-private-configuration\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762251 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762130 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762251 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762150 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-certificates\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762251 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762166 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-trusted-ca\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: 
\"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762251 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762181 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-bound-sa-token\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762251 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762196 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:13:04.762251 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762210 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff424\" (UniqueName: \"kubernetes.io/projected/dc6985aa-5589-44b0-97f0-b862837c4008-kube-api-access-ff424\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:13:04.762487 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762267 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-ca-trust-extracted\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762487 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762312 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-installation-pull-secrets\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762487 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762333 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmw6d\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-kube-api-access-tmw6d\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.762487 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.762354 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm" Apr 20 21:13:04.862883 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.862859 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-image-registry-private-configuration\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.862887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: 
\"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.862911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-certificates\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.862928 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-trusted-ca\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.862946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-bound-sa-token\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.863035 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.863053 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863063 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.863107 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:05.363086477 +0000 UTC m=+35.092055055 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff424\" (UniqueName: \"kubernetes.io/projected/dc6985aa-5589-44b0-97f0-b862837c4008-kube-api-access-ff424\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.863168 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863184 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-ca-trust-extracted\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" 
Apr 20 21:13:04.863268 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.863225 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:05.363198718 +0000 UTC m=+35.092167294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-installation-pull-secrets\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmw6d\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-kube-api-access-tmw6d\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-config-volume\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-tmp-dir\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf9p2\" (UniqueName: \"kubernetes.io/projected/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-kube-api-access-pf9p2\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-certificates\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-ca-trust-extracted\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.863660 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:04.863886 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:04.863718 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:05.363703426 +0000 UTC m=+35.092672021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:04.864228 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-tmp-dir\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:04.864228 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.863944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-trusted-ca\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.864228 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.864088 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-config-volume\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:04.867010 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.866976 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-image-registry-private-configuration\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.867062 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.867020 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-installation-pull-secrets\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.871951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.871908 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-bound-sa-token\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.872157 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.872132 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf9p2\" (UniqueName: \"kubernetes.io/projected/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-kube-api-access-pf9p2\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:04.872290 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.872233 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmw6d\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-kube-api-access-tmw6d\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:04.872628 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:04.872608 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff424\" (UniqueName: \"kubernetes.io/projected/dc6985aa-5589-44b0-97f0-b862837c4008-kube-api-access-ff424\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7"
Apr 20 21:13:05.368782 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.368587 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7"
Apr 20 21:13:05.368888 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.368789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:05.368888 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:05.368713 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:05.368888 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:05.368884 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:13:05.369006 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:05.368893 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found
Apr 20 21:13:05.369006 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:05.368906 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:05.369006 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:05.368884 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:06.36886778 +0000 UTC m=+36.097836357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found
Apr 20 21:13:05.369006 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.368823 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:05.369006 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:05.368941 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:06.368927269 +0000 UTC m=+36.097895847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found
Apr 20 21:13:05.369006 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:05.368955 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:06.368948817 +0000 UTC m=+36.097917394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:05.803440 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.803357 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:13:05.803579 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.803482 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:13:05.803675 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.803493 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6"
Apr 20 21:13:05.805830 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.805812 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 21:13:05.806679 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.806646 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 21:13:05.806786 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.806690 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 21:13:05.806786 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.806706 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs7p8\""
Apr 20 21:13:05.806786 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.806710 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lxd5q\""
Apr 20 21:13:05.806786 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.806756 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 21:13:05.945620 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.945593 2567 generic.go:358] "Generic (PLEG): container finished" podID="089c1db7-01a7-42ee-bf2b-a07303e05826" containerID="c26c4deaca2bc2438526b0a05b83459c80af792a31418042b4cf63ec0025f04c" exitCode=0
Apr 20 21:13:05.946025 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:05.945649 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerDied","Data":"c26c4deaca2bc2438526b0a05b83459c80af792a31418042b4cf63ec0025f04c"}
Apr 20 21:13:06.377557 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:06.377503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:06.377557 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:06.377543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:06.377696 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:06.377562 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7"
Apr 20 21:13:06.377696 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:06.377658 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:06.377696 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:06.377658 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:06.377816 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:06.377662 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:13:06.377816 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:06.377710 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:08.377696216 +0000 UTC m=+38.106664794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found
Apr 20 21:13:06.377816 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:06.377710 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found
Apr 20 21:13:06.377816 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:06.377723 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:08.377717235 +0000 UTC m=+38.106685813 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:06.377816 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:06.377742 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:08.377728199 +0000 UTC m=+38.106696775 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found
Apr 20 21:13:06.949908 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:06.949873 2567 generic.go:358] "Generic (PLEG): container finished" podID="089c1db7-01a7-42ee-bf2b-a07303e05826" containerID="0c758b3a748a1131a6320d6676021ce5357caf40a802df9e0f0a1fe4f1fe3164" exitCode=0
Apr 20 21:13:06.950451 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:06.949931 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerDied","Data":"0c758b3a748a1131a6320d6676021ce5357caf40a802df9e0f0a1fe4f1fe3164"}
Apr 20 21:13:07.956580 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:07.956550 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" event={"ID":"089c1db7-01a7-42ee-bf2b-a07303e05826","Type":"ContainerStarted","Data":"41ff04e6b590e8f3cae08e38deca52a35acb34603a23130a9b671ee9795c9567"}
Apr 20 21:13:07.979259 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:07.979180 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wqrxp" podStartSLOduration=5.606181818 podStartE2EDuration="36.979160929s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:12:33.526348262 +0000 UTC m=+3.255316847" lastFinishedPulling="2026-04-20 21:13:04.899327381 +0000 UTC m=+34.628295958" observedRunningTime="2026-04-20 21:13:07.977703976 +0000 UTC m=+37.706672573" watchObservedRunningTime="2026-04-20 21:13:07.979160929 +0000 UTC m=+37.708129531"
Apr 20 21:13:08.389887 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:08.389860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:08.390080 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:08.389894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7"
Apr 20 21:13:08.390080 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:08.390007 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:08.390080 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:08.390028 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:13:08.390080 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:08.390048 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found
Apr 20 21:13:08.390261 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:08.390043 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:08.390261 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:08.390051 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:12.390038925 +0000 UTC m=+42.119007502 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found
Apr 20 21:13:08.390261 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:08.390095 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:08.390261 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:08.390120 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:12.390102543 +0000 UTC m=+42.119071127 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found
Apr 20 21:13:08.390261 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:08.390138 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:12.390128772 +0000 UTC m=+42.119097351 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:12.416464 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:12.416429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:12.416464 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:12.416479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:12.416502 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7"
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:12.416607 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:12.416617 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:12.416610 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:12.416673 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:20.4166599 +0000 UTC m=+50.145628476 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:12.416676 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:12.416688 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:20.416682165 +0000 UTC m=+50.145650741 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:12.416906 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:12.416712 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:20.416693037 +0000 UTC m=+50.145661617 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found
Apr 20 21:13:20.468204 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:20.468165 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:20.468213 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7"
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:20.468285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:20.468321 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:20.468338 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:20.468405 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.468383162 +0000 UTC m=+66.197351758 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:20.468410 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:20.468413 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:20.468452 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.468441666 +0000 UTC m=+66.197410243 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:20.468629 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:20.468491 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.468471355 +0000 UTC m=+66.197439943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found
Apr 20 21:13:29.943696 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:29.943658 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prfjl"
Apr 20 21:13:36.478061 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.478018 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5"
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.478083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7"
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.478102 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.478114 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.478175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp"
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.478202 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.478118 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.478298 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.478171 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.478316 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:08.478293808 +0000 UTC m=+98.207262385 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.478409 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:08.478385865 +0000 UTC m=+98.207354441 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:36.478465 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.478427 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:08.478416978 +0000 UTC m=+98.207385555 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found Apr 20 21:13:36.480536 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.480513 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 21:13:36.480536 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.480528 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 21:13:36.488698 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.488678 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 21:13:36.488744 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:13:36.488732 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs podName:03abd218-9d5d-4f78-9ff1-919c66c5417e nodeName:}" failed. No retries permitted until 2026-04-20 21:14:40.488719565 +0000 UTC m=+130.217688141 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs") pod "network-metrics-daemon-vc5dw" (UID: "03abd218-9d5d-4f78-9ff1-919c66c5417e") : secret "metrics-daemon-secret" not found Apr 20 21:13:36.491382 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.491362 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62995ee3-d913-46d1-a08a-f154a1b3137d-original-pull-secret\") pod \"global-pull-secret-syncer-blrzp\" (UID: \"62995ee3-d913-46d1-a08a-f154a1b3137d\") " pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:13:36.679178 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.679148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:13:36.681497 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.681483 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 21:13:36.691998 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.691972 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 21:13:36.703143 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.703123 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fbk\" (UniqueName: \"kubernetes.io/projected/6fde3cd9-8c1d-4801-8eeb-c3bfd3815846-kube-api-access-76fbk\") pod \"network-check-target-rqns6\" (UID: \"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846\") " pod="openshift-network-diagnostics/network-check-target-rqns6" 
Apr 20 21:13:36.712965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.712945 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-blrzp" Apr 20 21:13:36.724477 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.724456 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs7p8\"" Apr 20 21:13:36.733100 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.733046 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:13:36.854961 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.854912 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-blrzp"] Apr 20 21:13:36.858824 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:13:36.858797 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62995ee3_d913_46d1_a08a_f154a1b3137d.slice/crio-7fd03ad6a2d1f335ca42f04f1245e3b123f0a253be4af9b713a1565c3421125b WatchSource:0}: Error finding container 7fd03ad6a2d1f335ca42f04f1245e3b123f0a253be4af9b713a1565c3421125b: Status 404 returned error can't find the container with id 7fd03ad6a2d1f335ca42f04f1245e3b123f0a253be4af9b713a1565c3421125b Apr 20 21:13:36.874890 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:36.874870 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rqns6"] Apr 20 21:13:36.877755 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:13:36.877733 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fde3cd9_8c1d_4801_8eeb_c3bfd3815846.slice/crio-5505194737b6cd980049915af34565feded5f3e543cc4e38f30ce0275da0f35a WatchSource:0}: Error finding container 
5505194737b6cd980049915af34565feded5f3e543cc4e38f30ce0275da0f35a: Status 404 returned error can't find the container with id 5505194737b6cd980049915af34565feded5f3e543cc4e38f30ce0275da0f35a Apr 20 21:13:37.007571 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:37.007499 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-blrzp" event={"ID":"62995ee3-d913-46d1-a08a-f154a1b3137d","Type":"ContainerStarted","Data":"7fd03ad6a2d1f335ca42f04f1245e3b123f0a253be4af9b713a1565c3421125b"} Apr 20 21:13:37.008370 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:37.008349 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rqns6" event={"ID":"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846","Type":"ContainerStarted","Data":"5505194737b6cd980049915af34565feded5f3e543cc4e38f30ce0275da0f35a"} Apr 20 21:13:40.015972 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:40.015898 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rqns6" event={"ID":"6fde3cd9-8c1d-4801-8eeb-c3bfd3815846","Type":"ContainerStarted","Data":"cfd322efefcf6c4b5792435d3fd06bc3f6cb03b9184d35f41ebdaadc58613a8d"} Apr 20 21:13:40.016349 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:40.016041 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rqns6" Apr 20 21:13:40.032206 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:40.032159 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rqns6" podStartSLOduration=66.304422703 podStartE2EDuration="1m9.032144545s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:13:36.879553029 +0000 UTC m=+66.608521606" lastFinishedPulling="2026-04-20 21:13:39.607274867 +0000 UTC m=+69.336243448" observedRunningTime="2026-04-20 21:13:40.031330251 
+0000 UTC m=+69.760298851" watchObservedRunningTime="2026-04-20 21:13:40.032144545 +0000 UTC m=+69.761113146" Apr 20 21:13:42.997584 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:42.997552 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6"] Apr 20 21:13:42.999449 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:42.999430 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.001761 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.001738 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 21:13:43.001889 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.001773 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 21:13:43.002571 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.002548 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 21:13:43.002571 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.002562 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 21:13:43.002728 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.002580 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 21:13:43.002728 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.002588 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 21:13:43.002728 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.002612 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 21:13:43.010511 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.010488 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6"] Apr 20 21:13:43.023459 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.023432 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-blrzp" event={"ID":"62995ee3-d913-46d1-a08a-f154a1b3137d","Type":"ContainerStarted","Data":"98ac34d566bf3b2149339c86b82a9470068873564cf91194392c56ae61d0ab52"} Apr 20 21:13:43.042260 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.042218 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-blrzp" podStartSLOduration=66.599767087 podStartE2EDuration="1m12.042208236s" podCreationTimestamp="2026-04-20 21:12:31 +0000 UTC" firstStartedPulling="2026-04-20 21:13:36.860445371 +0000 UTC m=+66.589413948" lastFinishedPulling="2026-04-20 21:13:42.302886518 +0000 UTC m=+72.031855097" observedRunningTime="2026-04-20 21:13:43.041518677 +0000 UTC m=+72.770487276" watchObservedRunningTime="2026-04-20 21:13:43.042208236 +0000 UTC m=+72.771176834" Apr 20 21:13:43.128224 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.128202 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-hub\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.128295 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.128235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-ca\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.128295 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.128251 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.128295 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.128287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/318fe156-1921-46d6-8501-78f73ec1cdfd-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.128405 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.128358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.128405 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.128391 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt8qg\" (UniqueName: \"kubernetes.io/projected/318fe156-1921-46d6-8501-78f73ec1cdfd-kube-api-access-rt8qg\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.229288 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.229259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt8qg\" (UniqueName: \"kubernetes.io/projected/318fe156-1921-46d6-8501-78f73ec1cdfd-kube-api-access-rt8qg\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.229387 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.229301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-hub\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.229387 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.229336 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-ca\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.229387 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.229358 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-hub-kubeconfig\") 
pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.229500 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.229395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/318fe156-1921-46d6-8501-78f73ec1cdfd-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.229588 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.229571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.230131 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.230086 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/318fe156-1921-46d6-8501-78f73ec1cdfd-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.231954 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.231927 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-ca\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.232070 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.231972 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.232416 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.232399 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.232467 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.232412 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/318fe156-1921-46d6-8501-78f73ec1cdfd-hub\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.239956 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.239937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt8qg\" (UniqueName: \"kubernetes.io/projected/318fe156-1921-46d6-8501-78f73ec1cdfd-kube-api-access-rt8qg\") pod \"cluster-proxy-proxy-agent-7867db584d-8scb6\" (UID: \"318fe156-1921-46d6-8501-78f73ec1cdfd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.321549 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.321498 2567 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" Apr 20 21:13:43.433025 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:43.432978 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6"] Apr 20 21:13:43.435852 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:13:43.435821 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod318fe156_1921_46d6_8501_78f73ec1cdfd.slice/crio-edbff25af55d3cba4d42b18b425e53cd831a2b874c555d74667dfd2b94c70932 WatchSource:0}: Error finding container edbff25af55d3cba4d42b18b425e53cd831a2b874c555d74667dfd2b94c70932: Status 404 returned error can't find the container with id edbff25af55d3cba4d42b18b425e53cd831a2b874c555d74667dfd2b94c70932 Apr 20 21:13:44.026078 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:44.026041 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" event={"ID":"318fe156-1921-46d6-8501-78f73ec1cdfd","Type":"ContainerStarted","Data":"edbff25af55d3cba4d42b18b425e53cd831a2b874c555d74667dfd2b94c70932"} Apr 20 21:13:47.037313 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:47.037271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" event={"ID":"318fe156-1921-46d6-8501-78f73ec1cdfd","Type":"ContainerStarted","Data":"a0b06bb5b5fb42a69c8fb0f8c5c83496880d124b3c82748b4b5f0970a24a9e70"} Apr 20 21:13:49.042790 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:49.042753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" 
event={"ID":"318fe156-1921-46d6-8501-78f73ec1cdfd","Type":"ContainerStarted","Data":"304df33f63de3aba9ba330468016ee4e6a494da94751ae6c183980311511766e"} Apr 20 21:13:49.042790 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:49.042788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" event={"ID":"318fe156-1921-46d6-8501-78f73ec1cdfd","Type":"ContainerStarted","Data":"9d7fd709ecf1ff9c50bc115a4a8039aaf911e2b0eaffdb3c33fc098607aa201c"} Apr 20 21:13:49.060980 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:13:49.060933 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" podStartSLOduration=2.333444433 podStartE2EDuration="7.06090799s" podCreationTimestamp="2026-04-20 21:13:42 +0000 UTC" firstStartedPulling="2026-04-20 21:13:43.437582609 +0000 UTC m=+73.166551186" lastFinishedPulling="2026-04-20 21:13:48.165046142 +0000 UTC m=+77.894014743" observedRunningTime="2026-04-20 21:13:49.060254544 +0000 UTC m=+78.789223144" watchObservedRunningTime="2026-04-20 21:13:49.06090799 +0000 UTC m=+78.789876588" Apr 20 21:14:08.506189 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:08.506148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm" Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:08.506203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") pod \"image-registry-5c7f65c64-sjpd5\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " 
pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:08.506224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:08.506309 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:08.506314 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:08.506319 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:08.506334 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c7f65c64-sjpd5: secret "image-registry-tls" not found Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:08.506368 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert podName:dc6985aa-5589-44b0-97f0-b862837c4008 nodeName:}" failed. No retries permitted until 2026-04-20 21:15:12.506354686 +0000 UTC m=+162.235323262 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert") pod "ingress-canary-mm5c7" (UID: "dc6985aa-5589-44b0-97f0-b862837c4008") : secret "canary-serving-cert" not found Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:08.506389 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls podName:c9a1fa00-f1f8-4c50-a5de-e17cf56544f7 nodeName:}" failed. No retries permitted until 2026-04-20 21:15:12.506376005 +0000 UTC m=+162.235344583 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls") pod "image-registry-5c7f65c64-sjpd5" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7") : secret "image-registry-tls" not found Apr 20 21:14:08.506592 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:08.506402 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls podName:770298fa-c6f6-4828-8681-c2a0e5ebc1b5 nodeName:}" failed. No retries permitted until 2026-04-20 21:15:12.506396478 +0000 UTC m=+162.235365056 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls") pod "dns-default-cnvwm" (UID: "770298fa-c6f6-4828-8681-c2a0e5ebc1b5") : secret "dns-default-metrics-tls" not found
Apr 20 21:14:11.020755 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:11.020722 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rqns6"
Apr 20 21:14:16.095709 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:16.095681 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qfhjp_0b4898da-9e0d-4a11-bec8-8eba5efe7422/dns-node-resolver/0.log"
Apr 20 21:14:16.895617 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:16.895590 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s6cxs_3b0c9c36-7f31-4319-bc80-862234ec47e6/node-ca/0.log"
Apr 20 21:14:34.142735 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.142699 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kmpf4"]
Apr 20 21:14:34.146039 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.146024 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.149318 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.149296 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 21:14:34.149439 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.149296 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 21:14:34.149650 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.149634 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 21:14:34.149714 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.149653 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 21:14:34.149764 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.149731 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4kwqq\""
Apr 20 21:14:34.160981 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.160958 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kmpf4"]
Apr 20 21:14:34.292415 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.292378 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/680afb98-3015-4cc5-8729-e0c66f98f554-crio-socket\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.292415 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.292416 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/680afb98-3015-4cc5-8729-e0c66f98f554-data-volume\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.292626 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.292443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/680afb98-3015-4cc5-8729-e0c66f98f554-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.292626 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.292472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/680afb98-3015-4cc5-8729-e0c66f98f554-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.292626 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.292528 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ng86\" (UniqueName: \"kubernetes.io/projected/680afb98-3015-4cc5-8729-e0c66f98f554-kube-api-access-4ng86\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394050 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.393931 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/680afb98-3015-4cc5-8729-e0c66f98f554-crio-socket\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394050 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.394009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/680afb98-3015-4cc5-8729-e0c66f98f554-data-volume\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394050 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.394042 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/680afb98-3015-4cc5-8729-e0c66f98f554-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394311 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.394052 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/680afb98-3015-4cc5-8729-e0c66f98f554-crio-socket\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394497 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.394463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/680afb98-3015-4cc5-8729-e0c66f98f554-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394572 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.394531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ng86\" (UniqueName: \"kubernetes.io/projected/680afb98-3015-4cc5-8729-e0c66f98f554-kube-api-access-4ng86\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394632 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.394614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/680afb98-3015-4cc5-8729-e0c66f98f554-data-volume\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.394729 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.394705 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/680afb98-3015-4cc5-8729-e0c66f98f554-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.397254 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.397200 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/680afb98-3015-4cc5-8729-e0c66f98f554-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.406634 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.406604 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ng86\" (UniqueName: \"kubernetes.io/projected/680afb98-3015-4cc5-8729-e0c66f98f554-kube-api-access-4ng86\") pod \"insights-runtime-extractor-kmpf4\" (UID: \"680afb98-3015-4cc5-8729-e0c66f98f554\") " pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.454246 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.454222 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kmpf4"
Apr 20 21:14:34.569478 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:34.569446 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kmpf4"]
Apr 20 21:14:34.573237 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:14:34.573192 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680afb98_3015_4cc5_8729_e0c66f98f554.slice/crio-139c4134355acd14eb5a3a14900d9fc3eef43ea795cb45330028f0bb18499267 WatchSource:0}: Error finding container 139c4134355acd14eb5a3a14900d9fc3eef43ea795cb45330028f0bb18499267: Status 404 returned error can't find the container with id 139c4134355acd14eb5a3a14900d9fc3eef43ea795cb45330028f0bb18499267
Apr 20 21:14:35.136708 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:35.136675 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmpf4" event={"ID":"680afb98-3015-4cc5-8729-e0c66f98f554","Type":"ContainerStarted","Data":"bfbad49d0685bfc1b65c9a70af68b8be381f823174dea912a14c9bad0872fad2"}
Apr 20 21:14:35.136708 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:35.136710 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmpf4" event={"ID":"680afb98-3015-4cc5-8729-e0c66f98f554","Type":"ContainerStarted","Data":"139c4134355acd14eb5a3a14900d9fc3eef43ea795cb45330028f0bb18499267"}
Apr 20 21:14:36.142439 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:36.142398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmpf4" event={"ID":"680afb98-3015-4cc5-8729-e0c66f98f554","Type":"ContainerStarted","Data":"7c465b5ec57f39c75754b206baa0bbc5fbc50234701afd25d3c04f4a79ea17e3"}
Apr 20 21:14:37.146326 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:37.146290 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kmpf4" event={"ID":"680afb98-3015-4cc5-8729-e0c66f98f554","Type":"ContainerStarted","Data":"183be2707ae80f8a6e7035a3ed5e5bc622f1f805d5229b4add690bb332c5d0bc"}
Apr 20 21:14:37.163247 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:37.163050 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kmpf4" podStartSLOduration=0.861395768 podStartE2EDuration="3.163032856s" podCreationTimestamp="2026-04-20 21:14:34 +0000 UTC" firstStartedPulling="2026-04-20 21:14:34.64607786 +0000 UTC m=+124.375046437" lastFinishedPulling="2026-04-20 21:14:36.947714945 +0000 UTC m=+126.676683525" observedRunningTime="2026-04-20 21:14:37.162951138 +0000 UTC m=+126.891919760" watchObservedRunningTime="2026-04-20 21:14:37.163032856 +0000 UTC m=+126.892001459"
Apr 20 21:14:40.540890 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:40.540844 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:14:40.543333 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:40.543305 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03abd218-9d5d-4f78-9ff1-919c66c5417e-metrics-certs\") pod \"network-metrics-daemon-vc5dw\" (UID: \"03abd218-9d5d-4f78-9ff1-919c66c5417e\") " pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:14:40.620086 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:40.620055 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lxd5q\""
Apr 20 21:14:40.627953 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:40.627933 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vc5dw"
Apr 20 21:14:40.744880 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:40.744849 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vc5dw"]
Apr 20 21:14:40.748644 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:14:40.748616 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03abd218_9d5d_4f78_9ff1_919c66c5417e.slice/crio-56d0e9c86a2634cd353e52399844b5cffd6dde4caf588c06275ea0c636f918c5 WatchSource:0}: Error finding container 56d0e9c86a2634cd353e52399844b5cffd6dde4caf588c06275ea0c636f918c5: Status 404 returned error can't find the container with id 56d0e9c86a2634cd353e52399844b5cffd6dde4caf588c06275ea0c636f918c5
Apr 20 21:14:41.157202 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:41.157164 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vc5dw" event={"ID":"03abd218-9d5d-4f78-9ff1-919c66c5417e","Type":"ContainerStarted","Data":"56d0e9c86a2634cd353e52399844b5cffd6dde4caf588c06275ea0c636f918c5"}
Apr 20 21:14:42.163515 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:42.163482 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vc5dw" event={"ID":"03abd218-9d5d-4f78-9ff1-919c66c5417e","Type":"ContainerStarted","Data":"657f0e4b02f3b97cad753e771bc1ab648dd98f210c6aebeba4818fc95db3a5e5"}
Apr 20 21:14:42.163515 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:42.163522 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vc5dw" event={"ID":"03abd218-9d5d-4f78-9ff1-919c66c5417e","Type":"ContainerStarted","Data":"252e7c0704a7635a5576c8d0d08453bebab168a0cf59fdbdd01b8faf872440d1"}
Apr 20 21:14:42.180377 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:42.180330 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vc5dw" podStartSLOduration=131.209369029 podStartE2EDuration="2m12.180316183s" podCreationTimestamp="2026-04-20 21:12:30 +0000 UTC" firstStartedPulling="2026-04-20 21:14:40.750413984 +0000 UTC m=+130.479382563" lastFinishedPulling="2026-04-20 21:14:41.721361125 +0000 UTC m=+131.450329717" observedRunningTime="2026-04-20 21:14:42.179588968 +0000 UTC m=+131.908557567" watchObservedRunningTime="2026-04-20 21:14:42.180316183 +0000 UTC m=+131.909284781"
Apr 20 21:14:52.624435 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.624396 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wr4jc"]
Apr 20 21:14:52.627729 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.627710 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.630147 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.630128 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 21:14:52.630359 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.630344 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 21:14:52.630602 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.630588 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 21:14:52.630707 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.630611 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hrxkm\""
Apr 20 21:14:52.630868 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.630855 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 21:14:52.631257 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.631239 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 21:14:52.631349 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.631274 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 21:14:52.724919 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.724885 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-tls\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725077 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.724941 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-accelerators-collector-config\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725077 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.724963 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93c295d-891c-4a62-9dd6-ebb8b010f291-metrics-client-ca\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725149 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.725084 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-root\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725149 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.725107 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725209 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.725169 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-textfile\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.725215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-sys\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.725232 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-wtmp\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.725283 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.725248 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qlwv\" (UniqueName: \"kubernetes.io/projected/e93c295d-891c-4a62-9dd6-ebb8b010f291-kube-api-access-9qlwv\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826496 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-textfile\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826496 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826499 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-sys\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826515 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-wtmp\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qlwv\" (UniqueName: \"kubernetes.io/projected/e93c295d-891c-4a62-9dd6-ebb8b010f291-kube-api-access-9qlwv\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-tls\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826577 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-sys\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-accelerators-collector-config\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826624 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93c295d-891c-4a62-9dd6-ebb8b010f291-metrics-client-ca\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:52.826656 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826663 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-wtmp\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.826694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826690 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-root\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.827056 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:52.826742 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-tls podName:e93c295d-891c-4a62-9dd6-ebb8b010f291 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:53.326722454 +0000 UTC m=+143.055691037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-tls") pod "node-exporter-wr4jc" (UID: "e93c295d-891c-4a62-9dd6-ebb8b010f291") : secret "node-exporter-tls" not found
Apr 20 21:14:52.827056 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e93c295d-891c-4a62-9dd6-ebb8b010f291-root\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.827056 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826805 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.827056 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.826841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-textfile\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.827235 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.827217 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-accelerators-collector-config\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.827270 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.827237 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93c295d-891c-4a62-9dd6-ebb8b010f291-metrics-client-ca\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.829246 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.829227 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:52.838479 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:52.838451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qlwv\" (UniqueName: \"kubernetes.io/projected/e93c295d-891c-4a62-9dd6-ebb8b010f291-kube-api-access-9qlwv\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:53.331105 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:53.331071 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-tls\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:53.333404 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:53.333375 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e93c295d-891c-4a62-9dd6-ebb8b010f291-node-exporter-tls\") pod \"node-exporter-wr4jc\" (UID: \"e93c295d-891c-4a62-9dd6-ebb8b010f291\") " pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:53.536820 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:53.536794 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wr4jc"
Apr 20 21:14:53.544348 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:14:53.544301 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93c295d_891c_4a62_9dd6_ebb8b010f291.slice/crio-86a3a6c326998188eb19b842fc9a68cc4bbc9a48fe01c862ce93c3ce87349cd1 WatchSource:0}: Error finding container 86a3a6c326998188eb19b842fc9a68cc4bbc9a48fe01c862ce93c3ce87349cd1: Status 404 returned error can't find the container with id 86a3a6c326998188eb19b842fc9a68cc4bbc9a48fe01c862ce93c3ce87349cd1
Apr 20 21:14:54.194052 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:54.194015 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wr4jc" event={"ID":"e93c295d-891c-4a62-9dd6-ebb8b010f291","Type":"ContainerStarted","Data":"86a3a6c326998188eb19b842fc9a68cc4bbc9a48fe01c862ce93c3ce87349cd1"}
Apr 20 21:14:55.197613 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.197583 2567 generic.go:358] "Generic (PLEG): container finished" podID="e93c295d-891c-4a62-9dd6-ebb8b010f291" containerID="7102aa99a097d5b061a64f665823d95f753fada9450fcfb825e4b0dd77cfc8bf" exitCode=0
Apr 20 21:14:55.197980 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.197651 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wr4jc" event={"ID":"e93c295d-891c-4a62-9dd6-ebb8b010f291","Type":"ContainerDied","Data":"7102aa99a097d5b061a64f665823d95f753fada9450fcfb825e4b0dd77cfc8bf"}
Apr 20 21:14:55.596214 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.596189 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d47bc568f-zcckd"]
Apr 20 21:14:55.599476 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.599456 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.601704 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.601680 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 20 21:14:55.601704 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.601702 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 20 21:14:55.601886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.601776 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 20 21:14:55.601886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.601782 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 20 21:14:55.601886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.601819 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-1jgflfj9309br\""
Apr 20 21:14:55.602084 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.601921 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-9vxln\""
Apr 20 21:14:55.602084 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.602030 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 20 21:14:55.614300 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.614278 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d47bc568f-zcckd"]
Apr 20 21:14:55.751373 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751318 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.751488 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751394 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.751488 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751414 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-grpc-tls\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.751488 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751440 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-tls\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.751488 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751459 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8748fbe8-d18a-419c-9436-b306ab28eaf7-metrics-client-ca\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.751488 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.751654 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751540 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fn9\" (UniqueName: \"kubernetes.io/projected/8748fbe8-d18a-419c-9436-b306ab28eaf7-kube-api-access-85fn9\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.751654 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.751579 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.852145 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852079 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.852145 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852125 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.852300 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852157 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-grpc-tls\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.852300 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852195 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-tls\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd"
Apr 20 21:14:55.852300 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8748fbe8-d18a-419c-9436-b306ab28eaf7-metrics-client-ca\") pod
\"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.852300 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.852504 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85fn9\" (UniqueName: \"kubernetes.io/projected/8748fbe8-d18a-419c-9436-b306ab28eaf7-kube-api-access-85fn9\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.852504 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.852911 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.852891 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8748fbe8-d18a-419c-9436-b306ab28eaf7-metrics-client-ca\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.855372 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.855347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.855522 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.855491 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.855968 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.855945 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-grpc-tls\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.856067 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.856031 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.856106 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.856034 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-tls\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.856300 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.856283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8748fbe8-d18a-419c-9436-b306ab28eaf7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.860749 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.860729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fn9\" (UniqueName: \"kubernetes.io/projected/8748fbe8-d18a-419c-9436-b306ab28eaf7-kube-api-access-85fn9\") pod \"thanos-querier-5d47bc568f-zcckd\" (UID: \"8748fbe8-d18a-419c-9436-b306ab28eaf7\") " pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:55.908242 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:55.908218 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:14:56.026158 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:56.026129 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d47bc568f-zcckd"] Apr 20 21:14:56.027739 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:14:56.027708 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8748fbe8_d18a_419c_9436_b306ab28eaf7.slice/crio-9b4020c300ae17d2d2bdd1ab1271bfe5948c3635d1927028b03df25a300587a4 WatchSource:0}: Error finding container 9b4020c300ae17d2d2bdd1ab1271bfe5948c3635d1927028b03df25a300587a4: Status 404 returned error can't find the container with id 9b4020c300ae17d2d2bdd1ab1271bfe5948c3635d1927028b03df25a300587a4 Apr 20 21:14:56.202367 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:56.202271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wr4jc" event={"ID":"e93c295d-891c-4a62-9dd6-ebb8b010f291","Type":"ContainerStarted","Data":"ddbb4768f3a57216260db5cc79c2e8b2b2424b5436f62d9eb607ca86d4615fde"} Apr 20 21:14:56.202367 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:56.202305 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wr4jc" event={"ID":"e93c295d-891c-4a62-9dd6-ebb8b010f291","Type":"ContainerStarted","Data":"80795f719eaeff1d406ff7703fc62d50da1443b15755baedb95080c2897efdc6"} Apr 20 21:14:56.203374 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:56.203351 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" event={"ID":"8748fbe8-d18a-419c-9436-b306ab28eaf7","Type":"ContainerStarted","Data":"9b4020c300ae17d2d2bdd1ab1271bfe5948c3635d1927028b03df25a300587a4"} Apr 20 21:14:56.221979 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:56.221928 2567 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/node-exporter-wr4jc" podStartSLOduration=3.376299501 podStartE2EDuration="4.221911926s" podCreationTimestamp="2026-04-20 21:14:52 +0000 UTC" firstStartedPulling="2026-04-20 21:14:53.545922005 +0000 UTC m=+143.274890582" lastFinishedPulling="2026-04-20 21:14:54.391534431 +0000 UTC m=+144.120503007" observedRunningTime="2026-04-20 21:14:56.220663811 +0000 UTC m=+145.949632409" watchObservedRunningTime="2026-04-20 21:14:56.221911926 +0000 UTC m=+145.950880526" Apr 20 21:14:57.502028 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.501980 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b488b889c-x8n2h"] Apr 20 21:14:57.504960 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.504944 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.508096 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.508076 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 21:14:57.508186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.508123 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 21:14:57.508186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.508123 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-69m2r\"" Apr 20 21:14:57.508186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.508166 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 21:14:57.508333 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.508215 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 21:14:57.508401 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:14:57.508387 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 21:14:57.508449 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.508390 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 21:14:57.508488 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.508449 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 21:14:57.512157 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.512134 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 21:14:57.517079 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.517060 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b488b889c-x8n2h"] Apr 20 21:14:57.665262 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.665230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-config\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.665421 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.665285 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-service-ca\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.665421 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.665402 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-serving-cert\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.665534 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.665452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-trusted-ca-bundle\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.665588 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.665568 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-oauth-config\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.665643 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.665623 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72q8n\" (UniqueName: \"kubernetes.io/projected/cb85147b-ab39-4dfd-a682-02f78ceef5df-kube-api-access-72q8n\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.665682 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.665665 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-oauth-serving-cert\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.767159 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.767074 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-serving-cert\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.767159 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.767131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-trusted-ca-bundle\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.767375 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.767191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-oauth-config\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.767375 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.767229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72q8n\" (UniqueName: \"kubernetes.io/projected/cb85147b-ab39-4dfd-a682-02f78ceef5df-kube-api-access-72q8n\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.767375 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.767281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-oauth-serving-cert\") pod \"console-6b488b889c-x8n2h\" (UID: 
\"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.767375 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.767327 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-config\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.767563 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.767370 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-service-ca\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.768159 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.768134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-oauth-serving-cert\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.768159 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.768153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-service-ca\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.768328 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.768223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-config\") pod \"console-6b488b889c-x8n2h\" 
(UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.768574 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.768548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-trusted-ca-bundle\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.769919 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.769899 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-serving-cert\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.770087 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.770069 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-oauth-config\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.775179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.775146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72q8n\" (UniqueName: \"kubernetes.io/projected/cb85147b-ab39-4dfd-a682-02f78ceef5df-kube-api-access-72q8n\") pod \"console-6b488b889c-x8n2h\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") " pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.814625 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.814599 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:14:57.815073 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.815032 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c7f65c64-sjpd5"] Apr 20 21:14:57.815562 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:14:57.815263 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" podUID="c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" Apr 20 21:14:57.819273 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.819251 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-8ccc47d7c-frb8n"] Apr 20 21:14:57.823902 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.823884 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.826178 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.826149 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 21:14:57.826178 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.826169 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 21:14:57.826344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.826149 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 21:14:57.826344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.826196 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 21:14:57.826344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.826149 2567 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 21:14:57.826344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.826152 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-kpt47\"" Apr 20 21:14:57.833549 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.833510 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 21:14:57.834411 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.834390 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8ccc47d7c-frb8n"] Apr 20 21:14:57.969451 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969420 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.969582 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969481 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-secret-telemeter-client\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.969582 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969508 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.969582 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969533 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-serving-certs-ca-bundle\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.969582 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969557 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-metrics-client-ca\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.969749 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-telemeter-client-tls\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.969749 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqh5\" (UniqueName: \"kubernetes.io/projected/607c02c5-695a-4a3d-bf04-683531b59ce0-kube-api-access-kkqh5\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: 
\"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:57.969749 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:57.969697 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-federate-client-tls\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070275 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.070176 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070275 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.070253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-secret-telemeter-client\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070477 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.070294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070477 ip-10-0-129-57 kubenswrapper[2567]: 
I0420 21:14:58.070319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-serving-certs-ca-bundle\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070477 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.070346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-metrics-client-ca\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070477 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.070367 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-telemeter-client-tls\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070477 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.070382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkqh5\" (UniqueName: \"kubernetes.io/projected/607c02c5-695a-4a3d-bf04-683531b59ce0-kube-api-access-kkqh5\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.070477 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.070417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-federate-client-tls\") pod 
\"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.071280 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.071255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-metrics-client-ca\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.071512 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.071485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-serving-certs-ca-bundle\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.072000 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.071963 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607c02c5-695a-4a3d-bf04-683531b59ce0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.073389 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.073316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-telemeter-client-tls\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.073389 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.073348 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.073755 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.073729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-federate-client-tls\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.073978 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.073954 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/607c02c5-695a-4a3d-bf04-683531b59ce0-secret-telemeter-client\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.078473 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.078452 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqh5\" (UniqueName: \"kubernetes.io/projected/607c02c5-695a-4a3d-bf04-683531b59ce0-kube-api-access-kkqh5\") pod \"telemeter-client-8ccc47d7c-frb8n\" (UID: \"607c02c5-695a-4a3d-bf04-683531b59ce0\") " pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.136022 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.135998 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" Apr 20 21:14:58.211651 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.211463 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:14:58.218492 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.218432 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:14:58.290457 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.290431 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8ccc47d7c-frb8n"] Apr 20 21:14:58.292686 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:14:58.292657 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607c02c5_695a_4a3d_bf04_683531b59ce0.slice/crio-27dd81968556be7d80fc0bdacc19691674eafd9a90468e507e490065d4e0821a WatchSource:0}: Error finding container 27dd81968556be7d80fc0bdacc19691674eafd9a90468e507e490065d4e0821a: Status 404 returned error can't find the container with id 27dd81968556be7d80fc0bdacc19691674eafd9a90468e507e490065d4e0821a Apr 20 21:14:58.305163 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.304035 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b488b889c-x8n2h"] Apr 20 21:14:58.306078 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:14:58.306040 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb85147b_ab39_4dfd_a682_02f78ceef5df.slice/crio-41df4f8fb5a794b091c3b70dad85b8280ee451f2f48888e02f4b228478151f90 WatchSource:0}: Error finding container 41df4f8fb5a794b091c3b70dad85b8280ee451f2f48888e02f4b228478151f90: Status 404 returned error can't find the container with id 
41df4f8fb5a794b091c3b70dad85b8280ee451f2f48888e02f4b228478151f90 Apr 20 21:14:58.373230 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373199 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-certificates\") pod \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " Apr 20 21:14:58.373339 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373246 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmw6d\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-kube-api-access-tmw6d\") pod \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " Apr 20 21:14:58.373400 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373362 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-trusted-ca\") pod \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " Apr 20 21:14:58.373578 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373554 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:14:58.373650 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373627 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-image-registry-private-configuration\") pod \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " Apr 20 21:14:58.373713 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373677 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-bound-sa-token\") pod \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " Apr 20 21:14:58.373771 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373726 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-installation-pull-secrets\") pod \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " Apr 20 21:14:58.373771 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373727 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:14:58.373870 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.373769 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-ca-trust-extracted\") pod \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\" (UID: \"c9a1fa00-f1f8-4c50-a5de-e17cf56544f7\") " Apr 20 21:14:58.374203 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.374106 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-certificates\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:14:58.374203 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.374145 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-trusted-ca\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:14:58.374203 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.374165 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:14:58.375844 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.375820 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-kube-api-access-tmw6d" (OuterVolumeSpecName: "kube-api-access-tmw6d") pod "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7"). InnerVolumeSpecName "kube-api-access-tmw6d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:14:58.375910 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.375841 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:14:58.376404 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.376380 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:14:58.376609 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.376591 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" (UID: "c9a1fa00-f1f8-4c50-a5de-e17cf56544f7"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:14:58.475221 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.475191 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-image-registry-private-configuration\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:14:58.475221 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.475215 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-bound-sa-token\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:14:58.475221 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.475225 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-installation-pull-secrets\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:14:58.475407 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.475234 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-ca-trust-extracted\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:14:58.475407 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:58.475244 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmw6d\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-kube-api-access-tmw6d\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:14:59.217131 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.217092 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" 
event={"ID":"8748fbe8-d18a-419c-9436-b306ab28eaf7","Type":"ContainerStarted","Data":"6911946bd989fbaf420586ee75f234bec1e93f37467ad93c5e84d5bb6e1668a8"} Apr 20 21:14:59.217577 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.217139 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" event={"ID":"8748fbe8-d18a-419c-9436-b306ab28eaf7","Type":"ContainerStarted","Data":"15a5c5c0a3ecc1f35f9168deecb50a5fa768a2a7a3491cb090bb10ed6174acfa"} Apr 20 21:14:59.217577 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.217153 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" event={"ID":"8748fbe8-d18a-419c-9436-b306ab28eaf7","Type":"ContainerStarted","Data":"94d056237e9637a8455cc54734ba1d460bf2ded99b096820612c7f676c0b4990"} Apr 20 21:14:59.218317 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.218275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" event={"ID":"607c02c5-695a-4a3d-bf04-683531b59ce0","Type":"ContainerStarted","Data":"27dd81968556be7d80fc0bdacc19691674eafd9a90468e507e490065d4e0821a"} Apr 20 21:14:59.220569 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.219819 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b488b889c-x8n2h" event={"ID":"cb85147b-ab39-4dfd-a682-02f78ceef5df","Type":"ContainerStarted","Data":"41df4f8fb5a794b091c3b70dad85b8280ee451f2f48888e02f4b228478151f90"} Apr 20 21:14:59.220569 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.219867 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c7f65c64-sjpd5" Apr 20 21:14:59.251844 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.251819 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c7f65c64-sjpd5"] Apr 20 21:14:59.255178 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.254933 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c7f65c64-sjpd5"] Apr 20 21:14:59.383602 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:14:59.383576 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7-registry-tls\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:15:00.231476 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:00.231440 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" event={"ID":"8748fbe8-d18a-419c-9436-b306ab28eaf7","Type":"ContainerStarted","Data":"0a209dd00629c0337ff078f59ef27893868eac71ce0242457c889aa35daac9e9"} Apr 20 21:15:00.231476 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:00.231479 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" event={"ID":"8748fbe8-d18a-419c-9436-b306ab28eaf7","Type":"ContainerStarted","Data":"c3b1c151d660f71bcc05353a28c64d39ad97cec26a7f7014f459458e30f1006b"} Apr 20 21:15:00.231981 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:00.231497 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" event={"ID":"8748fbe8-d18a-419c-9436-b306ab28eaf7","Type":"ContainerStarted","Data":"d0b99f1eef929d8e63b10273f2dcfa8085be5bc2fbb413595509667059e158f9"} Apr 20 21:15:00.231981 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:00.231675 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:15:00.254034 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:00.253965 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" podStartSLOduration=2.038356392 podStartE2EDuration="5.253947732s" podCreationTimestamp="2026-04-20 21:14:55 +0000 UTC" firstStartedPulling="2026-04-20 21:14:56.029629457 +0000 UTC m=+145.758598033" lastFinishedPulling="2026-04-20 21:14:59.245220789 +0000 UTC m=+148.974189373" observedRunningTime="2026-04-20 21:15:00.252037522 +0000 UTC m=+149.981006124" watchObservedRunningTime="2026-04-20 21:15:00.253947732 +0000 UTC m=+149.982916333" Apr 20 21:15:00.807187 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:00.807151 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a1fa00-f1f8-4c50-a5de-e17cf56544f7" path="/var/lib/kubelet/pods/c9a1fa00-f1f8-4c50-a5de-e17cf56544f7/volumes" Apr 20 21:15:01.235172 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:01.235134 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" event={"ID":"607c02c5-695a-4a3d-bf04-683531b59ce0","Type":"ContainerStarted","Data":"37de225271537d0df9f5fc5c6f16ccfb1e58079f02cb727a18f323f0beb7dffd"} Apr 20 21:15:01.236591 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:01.236562 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b488b889c-x8n2h" event={"ID":"cb85147b-ab39-4dfd-a682-02f78ceef5df","Type":"ContainerStarted","Data":"a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218"} Apr 20 21:15:01.254051 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:01.253980 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b488b889c-x8n2h" podStartSLOduration=1.465639035 podStartE2EDuration="4.253964052s" podCreationTimestamp="2026-04-20 21:14:57 +0000 UTC" 
firstStartedPulling="2026-04-20 21:14:58.307736775 +0000 UTC m=+148.036705352" lastFinishedPulling="2026-04-20 21:15:01.096061788 +0000 UTC m=+150.825030369" observedRunningTime="2026-04-20 21:15:01.253487728 +0000 UTC m=+150.982456327" watchObservedRunningTime="2026-04-20 21:15:01.253964052 +0000 UTC m=+150.982932652" Apr 20 21:15:03.246918 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:03.246874 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" event={"ID":"607c02c5-695a-4a3d-bf04-683531b59ce0","Type":"ContainerStarted","Data":"247b1e0bcf455684544e44d3c2243fb718b3080d445a569945566e04979256af"} Apr 20 21:15:03.246918 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:03.246915 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" event={"ID":"607c02c5-695a-4a3d-bf04-683531b59ce0","Type":"ContainerStarted","Data":"7529dbda59b8db57038eec1ea1bb56c9c4f0e9894ccc969a8cc6f7deb68603bb"} Apr 20 21:15:03.271727 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:03.271669 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-8ccc47d7c-frb8n" podStartSLOduration=2.375523628 podStartE2EDuration="6.271649588s" podCreationTimestamp="2026-04-20 21:14:57 +0000 UTC" firstStartedPulling="2026-04-20 21:14:58.295512013 +0000 UTC m=+148.024480590" lastFinishedPulling="2026-04-20 21:15:02.19163797 +0000 UTC m=+151.920606550" observedRunningTime="2026-04-20 21:15:03.270264017 +0000 UTC m=+152.999232618" watchObservedRunningTime="2026-04-20 21:15:03.271649588 +0000 UTC m=+153.000618188" Apr 20 21:15:03.323214 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:03.323158 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" podUID="318fe156-1921-46d6-8501-78f73ec1cdfd" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 20 21:15:03.836949 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:03.836916 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b488b889c-x8n2h"] Apr 20 21:15:06.243381 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:06.243352 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d47bc568f-zcckd" Apr 20 21:15:07.673100 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:15:07.673054 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cnvwm" podUID="770298fa-c6f6-4828-8681-c2a0e5ebc1b5" Apr 20 21:15:07.685395 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:15:07.685356 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-mm5c7" podUID="dc6985aa-5589-44b0-97f0-b862837c4008" Apr 20 21:15:07.814909 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:07.814870 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b488b889c-x8n2h" Apr 20 21:15:08.262786 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:08.262751 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cnvwm" Apr 20 21:15:12.580928 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:12.580887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:15:12.580928 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:12.580946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm" Apr 20 21:15:12.583385 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:12.583356 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/770298fa-c6f6-4828-8681-c2a0e5ebc1b5-metrics-tls\") pod \"dns-default-cnvwm\" (UID: \"770298fa-c6f6-4828-8681-c2a0e5ebc1b5\") " pod="openshift-dns/dns-default-cnvwm" Apr 20 21:15:12.583528 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:12.583433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6985aa-5589-44b0-97f0-b862837c4008-cert\") pod \"ingress-canary-mm5c7\" (UID: \"dc6985aa-5589-44b0-97f0-b862837c4008\") " pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:15:12.766189 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:12.766162 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtslr\"" Apr 20 21:15:12.774672 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:12.774655 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cnvwm" Apr 20 21:15:12.892802 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:12.892733 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cnvwm"] Apr 20 21:15:12.895250 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:15:12.895217 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod770298fa_c6f6_4828_8681_c2a0e5ebc1b5.slice/crio-e70dc3cf5ac834d826b960838492296b930faf7c3db030f3b1a1da4995faaffe WatchSource:0}: Error finding container e70dc3cf5ac834d826b960838492296b930faf7c3db030f3b1a1da4995faaffe: Status 404 returned error can't find the container with id e70dc3cf5ac834d826b960838492296b930faf7c3db030f3b1a1da4995faaffe Apr 20 21:15:13.282909 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:13.282771 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cnvwm" event={"ID":"770298fa-c6f6-4828-8681-c2a0e5ebc1b5","Type":"ContainerStarted","Data":"e70dc3cf5ac834d826b960838492296b930faf7c3db030f3b1a1da4995faaffe"} Apr 20 21:15:13.323024 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:13.322969 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" podUID="318fe156-1921-46d6-8501-78f73ec1cdfd" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 21:15:14.287289 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:14.287251 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cnvwm" event={"ID":"770298fa-c6f6-4828-8681-c2a0e5ebc1b5","Type":"ContainerStarted","Data":"eecadcdf65b3d9565483cc6b6b8912ce2256321fbe1672a8b17fef228b298b8a"} Apr 20 21:15:15.291267 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:15.291227 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cnvwm" 
event={"ID":"770298fa-c6f6-4828-8681-c2a0e5ebc1b5","Type":"ContainerStarted","Data":"c5cc53a2ebf6d08614d963accf797e4447c46c5b3fc2624cbe6a7fb7a629f37a"} Apr 20 21:15:15.291653 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:15.291327 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cnvwm" Apr 20 21:15:15.308959 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:15.308913 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cnvwm" podStartSLOduration=130.1028877 podStartE2EDuration="2m11.308901445s" podCreationTimestamp="2026-04-20 21:13:04 +0000 UTC" firstStartedPulling="2026-04-20 21:15:12.897123985 +0000 UTC m=+162.626092562" lastFinishedPulling="2026-04-20 21:15:14.103137726 +0000 UTC m=+163.832106307" observedRunningTime="2026-04-20 21:15:15.307167997 +0000 UTC m=+165.036136597" watchObservedRunningTime="2026-04-20 21:15:15.308901445 +0000 UTC m=+165.037870043" Apr 20 21:15:19.803670 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:19.803637 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:15:19.805920 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:19.805898 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kkzr9\"" Apr 20 21:15:19.814679 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:19.814657 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mm5c7" Apr 20 21:15:19.931137 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:19.931105 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mm5c7"] Apr 20 21:15:19.934277 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:15:19.934246 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6985aa_5589_44b0_97f0_b862837c4008.slice/crio-fec6ef0b45d5263cfe5dc6deb9dde29f0e52eb172ceb9934ff1577e4ad6fcff2 WatchSource:0}: Error finding container fec6ef0b45d5263cfe5dc6deb9dde29f0e52eb172ceb9934ff1577e4ad6fcff2: Status 404 returned error can't find the container with id fec6ef0b45d5263cfe5dc6deb9dde29f0e52eb172ceb9934ff1577e4ad6fcff2 Apr 20 21:15:20.304797 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:20.304760 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mm5c7" event={"ID":"dc6985aa-5589-44b0-97f0-b862837c4008","Type":"ContainerStarted","Data":"fec6ef0b45d5263cfe5dc6deb9dde29f0e52eb172ceb9934ff1577e4ad6fcff2"} Apr 20 21:15:22.312805 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:22.312775 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mm5c7" event={"ID":"dc6985aa-5589-44b0-97f0-b862837c4008","Type":"ContainerStarted","Data":"b0945e8cbccb3c0aabdd88e44fdcd961ff6c4751d70a7f307cfdb86d88e97f27"} Apr 20 21:15:22.327789 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:22.327746 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mm5c7" podStartSLOduration=136.633645145 podStartE2EDuration="2m18.327733135s" podCreationTimestamp="2026-04-20 21:13:04 +0000 UTC" firstStartedPulling="2026-04-20 21:15:19.936177054 +0000 UTC m=+169.665145633" lastFinishedPulling="2026-04-20 21:15:21.630265034 +0000 UTC m=+171.359233623" 
observedRunningTime="2026-04-20 21:15:22.327453622 +0000 UTC m=+172.056422222" watchObservedRunningTime="2026-04-20 21:15:22.327733135 +0000 UTC m=+172.056701733"
Apr 20 21:15:23.322278 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:23.322242 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" podUID="318fe156-1921-46d6-8501-78f73ec1cdfd" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 21:15:23.322633 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:23.322304 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6"
Apr 20 21:15:23.322757 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:23.322728 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"304df33f63de3aba9ba330468016ee4e6a494da94751ae6c183980311511766e"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 20 21:15:23.322796 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:23.322784 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" podUID="318fe156-1921-46d6-8501-78f73ec1cdfd" containerName="service-proxy" containerID="cri-o://304df33f63de3aba9ba330468016ee4e6a494da94751ae6c183980311511766e" gracePeriod=30
Apr 20 21:15:24.319743 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:24.319708 2567 generic.go:358] "Generic (PLEG): container finished" podID="318fe156-1921-46d6-8501-78f73ec1cdfd" containerID="304df33f63de3aba9ba330468016ee4e6a494da94751ae6c183980311511766e" exitCode=2
Apr 20 21:15:24.319913 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:24.319776 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" event={"ID":"318fe156-1921-46d6-8501-78f73ec1cdfd","Type":"ContainerDied","Data":"304df33f63de3aba9ba330468016ee4e6a494da94751ae6c183980311511766e"}
Apr 20 21:15:24.319913 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:24.319810 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7867db584d-8scb6" event={"ID":"318fe156-1921-46d6-8501-78f73ec1cdfd","Type":"ContainerStarted","Data":"a68974c1c1f7474c46b4e3f6b8e0246f27beabcf98643ea73c4861efb12a264d"}
Apr 20 21:15:25.295847 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:25.295811 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cnvwm"
Apr 20 21:15:28.855876 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:28.855841 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b488b889c-x8n2h" podUID="cb85147b-ab39-4dfd-a682-02f78ceef5df" containerName="console" containerID="cri-o://a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218" gracePeriod=15
Apr 20 21:15:29.089763 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.089741 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b488b889c-x8n2h_cb85147b-ab39-4dfd-a682-02f78ceef5df/console/0.log"
Apr 20 21:15:29.089912 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.089806 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b488b889c-x8n2h"
Apr 20 21:15:29.199190 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199121 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-config\") pod \"cb85147b-ab39-4dfd-a682-02f78ceef5df\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") "
Apr 20 21:15:29.199190 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199157 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-service-ca\") pod \"cb85147b-ab39-4dfd-a682-02f78ceef5df\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") "
Apr 20 21:15:29.199369 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199199 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-oauth-config\") pod \"cb85147b-ab39-4dfd-a682-02f78ceef5df\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") "
Apr 20 21:15:29.199369 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199249 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-trusted-ca-bundle\") pod \"cb85147b-ab39-4dfd-a682-02f78ceef5df\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") "
Apr 20 21:15:29.199369 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199286 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-oauth-serving-cert\") pod \"cb85147b-ab39-4dfd-a682-02f78ceef5df\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") "
Apr 20 21:15:29.199369 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199345 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72q8n\" (UniqueName: \"kubernetes.io/projected/cb85147b-ab39-4dfd-a682-02f78ceef5df-kube-api-access-72q8n\") pod \"cb85147b-ab39-4dfd-a682-02f78ceef5df\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") "
Apr 20 21:15:29.199568 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199378 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-serving-cert\") pod \"cb85147b-ab39-4dfd-a682-02f78ceef5df\" (UID: \"cb85147b-ab39-4dfd-a682-02f78ceef5df\") "
Apr 20 21:15:29.199621 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199589 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-config" (OuterVolumeSpecName: "console-config") pod "cb85147b-ab39-4dfd-a682-02f78ceef5df" (UID: "cb85147b-ab39-4dfd-a682-02f78ceef5df"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:15:29.199707 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199674 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-service-ca" (OuterVolumeSpecName: "service-ca") pod "cb85147b-ab39-4dfd-a682-02f78ceef5df" (UID: "cb85147b-ab39-4dfd-a682-02f78ceef5df"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:15:29.199806 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199709 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cb85147b-ab39-4dfd-a682-02f78ceef5df" (UID: "cb85147b-ab39-4dfd-a682-02f78ceef5df"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:15:29.199806 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.199724 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cb85147b-ab39-4dfd-a682-02f78ceef5df" (UID: "cb85147b-ab39-4dfd-a682-02f78ceef5df"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:15:29.201679 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.201658 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb85147b-ab39-4dfd-a682-02f78ceef5df-kube-api-access-72q8n" (OuterVolumeSpecName: "kube-api-access-72q8n") pod "cb85147b-ab39-4dfd-a682-02f78ceef5df" (UID: "cb85147b-ab39-4dfd-a682-02f78ceef5df"). InnerVolumeSpecName "kube-api-access-72q8n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:15:29.201740 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.201699 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cb85147b-ab39-4dfd-a682-02f78ceef5df" (UID: "cb85147b-ab39-4dfd-a682-02f78ceef5df"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:15:29.201799 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.201757 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cb85147b-ab39-4dfd-a682-02f78ceef5df" (UID: "cb85147b-ab39-4dfd-a682-02f78ceef5df"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:15:29.300236 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.300204 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-serving-cert\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:15:29.300236 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.300232 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-config\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:15:29.300382 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.300246 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-service-ca\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:15:29.300382 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.300259 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb85147b-ab39-4dfd-a682-02f78ceef5df-console-oauth-config\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:15:29.300382 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.300273 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-trusted-ca-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:15:29.300382 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.300285 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb85147b-ab39-4dfd-a682-02f78ceef5df-oauth-serving-cert\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:15:29.300382 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.300297 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72q8n\" (UniqueName: \"kubernetes.io/projected/cb85147b-ab39-4dfd-a682-02f78ceef5df-kube-api-access-72q8n\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:15:29.339495 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.339471 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b488b889c-x8n2h_cb85147b-ab39-4dfd-a682-02f78ceef5df/console/0.log"
Apr 20 21:15:29.339598 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.339508 2567 generic.go:358] "Generic (PLEG): container finished" podID="cb85147b-ab39-4dfd-a682-02f78ceef5df" containerID="a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218" exitCode=2
Apr 20 21:15:29.339598 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.339547 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b488b889c-x8n2h" event={"ID":"cb85147b-ab39-4dfd-a682-02f78ceef5df","Type":"ContainerDied","Data":"a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218"}
Apr 20 21:15:29.339598 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.339568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b488b889c-x8n2h" event={"ID":"cb85147b-ab39-4dfd-a682-02f78ceef5df","Type":"ContainerDied","Data":"41df4f8fb5a794b091c3b70dad85b8280ee451f2f48888e02f4b228478151f90"}
Apr 20 21:15:29.339598 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.339573 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b488b889c-x8n2h"
Apr 20 21:15:29.339598 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.339582 2567 scope.go:117] "RemoveContainer" containerID="a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218"
Apr 20 21:15:29.348147 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.348128 2567 scope.go:117] "RemoveContainer" containerID="a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218"
Apr 20 21:15:29.348399 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:15:29.348373 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218\": container with ID starting with a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218 not found: ID does not exist" containerID="a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218"
Apr 20 21:15:29.348480 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.348403 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218"} err="failed to get container status \"a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218\": rpc error: code = NotFound desc = could not find container \"a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218\": container with ID starting with a3aef98a21cd3f1afddcff5805c4bd1821d49fee58342bcba950f9029a0a2218 not found: ID does not exist"
Apr 20 21:15:29.358913 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.358893 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b488b889c-x8n2h"]
Apr 20 21:15:29.363006 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:29.362965 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b488b889c-x8n2h"]
Apr 20 21:15:30.806417 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:15:30.806391 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb85147b-ab39-4dfd-a682-02f78ceef5df" path="/var/lib/kubelet/pods/cb85147b-ab39-4dfd-a682-02f78ceef5df/volumes"
Apr 20 21:16:14.896786 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.896750 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:16:14.897231 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.897127 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb85147b-ab39-4dfd-a682-02f78ceef5df" containerName="console"
Apr 20 21:16:14.897231 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.897142 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb85147b-ab39-4dfd-a682-02f78ceef5df" containerName="console"
Apr 20 21:16:14.897231 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.897204 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb85147b-ab39-4dfd-a682-02f78ceef5df" containerName="console"
Apr 20 21:16:14.905484 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.905459 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:14.908268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.908244 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 21:16:14.908268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.908259 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 21:16:14.908747 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.908731 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kgbrw\""
Apr 20 21:16:14.908946 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.908797 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 21:16:14.909392 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.909376 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 21:16:14.909650 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.909628 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 21:16:14.909743 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.909664 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 21:16:14.909798 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.909613 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 21:16:14.910036 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.909906 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 21:16:14.910144 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.910121 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:16:14.913694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:14.913672 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 21:16:15.012560 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012539 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012662 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012662 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012662 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012758 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012758 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012700 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012758 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012749 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-web-config\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012847 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012770 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-config-out\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012847 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012787 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012847 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012802 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012847 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012824 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012966 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012845 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nbl\" (UniqueName: \"kubernetes.io/projected/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-kube-api-access-89nbl\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.012966 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.012876 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113319 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113293 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113430 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113325 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113430 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113537 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-web-config\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113537 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-config-out\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113537 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113527 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113680 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113680 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113568 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113680 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113593 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89nbl\" (UniqueName: \"kubernetes.io/projected/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-kube-api-access-89nbl\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113680 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113623 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113680 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113651 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113922 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113688 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.113922 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.113715 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.114230 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.114044 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.114956 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.114934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.115154 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.115129 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.116481 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.116456 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.117093 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.116867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-config-out\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.117093 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.116944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.117093 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.117048 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-web-config\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.117288 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.117120 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.117288 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.117255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.117507 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.117487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.117733 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.117717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.118911 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.118895 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.122243 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.122222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nbl\" (UniqueName: \"kubernetes.io/projected/3e2c9a4b-b674-44a4-bd07-72f35fda57b0-kube-api-access-89nbl\") pod \"alertmanager-main-0\" (UID: \"3e2c9a4b-b674-44a4-bd07-72f35fda57b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.215493 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.215436 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:15.340533 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.340507 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:16:15.342876 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:16:15.342851 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2c9a4b_b674_44a4_bd07_72f35fda57b0.slice/crio-89a956b7795bc457368c60c2d3faa16009ceb00cf2dbd77c056759a976b5e341 WatchSource:0}: Error finding container 89a956b7795bc457368c60c2d3faa16009ceb00cf2dbd77c056759a976b5e341: Status 404 returned error can't find the container with id 89a956b7795bc457368c60c2d3faa16009ceb00cf2dbd77c056759a976b5e341
Apr 20 21:16:15.464462 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.464433 2567 generic.go:358] "Generic (PLEG): container finished" podID="3e2c9a4b-b674-44a4-bd07-72f35fda57b0" containerID="8cffd57e4cafc0db71fd9f66f7f246fef76224a273153a9dc8ed995e6ae726e8" exitCode=0
Apr 20 21:16:15.464559 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.464481 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerDied","Data":"8cffd57e4cafc0db71fd9f66f7f246fef76224a273153a9dc8ed995e6ae726e8"}
Apr 20 21:16:15.464559 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:15.464502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerStarted","Data":"89a956b7795bc457368c60c2d3faa16009ceb00cf2dbd77c056759a976b5e341"}
Apr 20 21:16:17.472355 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:17.472281 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerStarted","Data":"d6575653c86c1e90eef3946284ec2a129b9ff7d2488c4a3a9c2a8adcf4a88937"} Apr 20 21:16:17.472355 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:17.472315 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerStarted","Data":"db698c42fff89deaf385c638e8bd93a5de3bcda1dbf8c580409812a7ad7a6dea"} Apr 20 21:16:17.472355 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:17.472325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerStarted","Data":"0999514b17edfc35724433651bf19a9b82f484f53f6b833ffba00ae3b5b172b9"} Apr 20 21:16:17.472355 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:17.472333 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerStarted","Data":"826dd225224c5d707e78e1d13e13a7dfd19211e7c54cc56035dab6ccfe2548c3"} Apr 20 21:16:17.472355 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:17.472340 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerStarted","Data":"02bfb391ae37202a6008642f60a42b55bfe4b80ef16a87ad7bea759c707417f6"} Apr 20 21:16:17.472355 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:17.472350 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3e2c9a4b-b674-44a4-bd07-72f35fda57b0","Type":"ContainerStarted","Data":"09c329b4e8e924469b4aab7d20af2a3d96a32e40d6ef6b4288ee593121144b5b"} Apr 20 21:16:17.500424 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:17.500368 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.074161607 podStartE2EDuration="3.500351145s" podCreationTimestamp="2026-04-20 21:16:14 +0000 UTC" firstStartedPulling="2026-04-20 21:16:15.465533055 +0000 UTC m=+225.194501632" lastFinishedPulling="2026-04-20 21:16:16.891722581 +0000 UTC m=+226.620691170" observedRunningTime="2026-04-20 21:16:17.497758035 +0000 UTC m=+227.226726635" watchObservedRunningTime="2026-04-20 21:16:17.500351145 +0000 UTC m=+227.229319748" Apr 20 21:16:23.950308 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.950269 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cc87fbcf9-826x6"] Apr 20 21:16:23.953536 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.953513 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:23.955817 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.955799 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 21:16:23.955908 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.955825 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 21:16:23.956707 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.956691 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 21:16:23.957038 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.957016 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-69m2r\"" Apr 20 21:16:23.957124 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.957037 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 21:16:23.957124 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.957068 2567 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 21:16:23.957124 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.957045 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 21:16:23.957124 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.957020 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 21:16:23.961520 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.961498 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 21:16:23.962671 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:23.962651 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cc87fbcf9-826x6"] Apr 20 21:16:24.077662 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.077627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-service-ca\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.077662 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.077661 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-oauth-serving-cert\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.077899 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.077687 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghqp\" 
(UniqueName: \"kubernetes.io/projected/261cec5b-ea8b-4b66-b5e7-2f33ae892080-kube-api-access-gghqp\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.077899 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.077790 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-trusted-ca-bundle\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.077899 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.077834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-config\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.077899 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.077884 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-oauth-config\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.078108 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.077954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-serving-cert\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.178353 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:16:24.178317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gghqp\" (UniqueName: \"kubernetes.io/projected/261cec5b-ea8b-4b66-b5e7-2f33ae892080-kube-api-access-gghqp\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.178441 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.178362 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-trusted-ca-bundle\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.178511 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.178488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-config\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.178562 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.178540 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-oauth-config\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.178618 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.178606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-serving-cert\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " 
pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.178669 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.178642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-service-ca\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.178719 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.178676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-oauth-serving-cert\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.179287 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.179263 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-config\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.179402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.179385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-trusted-ca-bundle\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.179454 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.179415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-service-ca\") pod \"console-5cc87fbcf9-826x6\" (UID: 
\"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.179549 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.179529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-oauth-serving-cert\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.181066 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.181038 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-oauth-config\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.181159 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.181150 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-serving-cert\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.186038 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.186019 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gghqp\" (UniqueName: \"kubernetes.io/projected/261cec5b-ea8b-4b66-b5e7-2f33ae892080-kube-api-access-gghqp\") pod \"console-5cc87fbcf9-826x6\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") " pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.263148 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.263067 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:24.397395 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.397364 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cc87fbcf9-826x6"] Apr 20 21:16:24.400635 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:16:24.400611 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod261cec5b_ea8b_4b66_b5e7_2f33ae892080.slice/crio-a14049352de773550d3c1e9616598171ee6344b67a35b01155c25e0632209cac WatchSource:0}: Error finding container a14049352de773550d3c1e9616598171ee6344b67a35b01155c25e0632209cac: Status 404 returned error can't find the container with id a14049352de773550d3c1e9616598171ee6344b67a35b01155c25e0632209cac Apr 20 21:16:24.493537 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.493496 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc87fbcf9-826x6" event={"ID":"261cec5b-ea8b-4b66-b5e7-2f33ae892080","Type":"ContainerStarted","Data":"09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7"} Apr 20 21:16:24.493537 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.493536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc87fbcf9-826x6" event={"ID":"261cec5b-ea8b-4b66-b5e7-2f33ae892080","Type":"ContainerStarted","Data":"a14049352de773550d3c1e9616598171ee6344b67a35b01155c25e0632209cac"} Apr 20 21:16:24.509939 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:24.509882 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cc87fbcf9-826x6" podStartSLOduration=1.509865802 podStartE2EDuration="1.509865802s" podCreationTimestamp="2026-04-20 21:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:16:24.508547248 +0000 UTC m=+234.237515847" 
watchObservedRunningTime="2026-04-20 21:16:24.509865802 +0000 UTC m=+234.238834401" Apr 20 21:16:34.263473 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:34.263441 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:34.263835 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:34.263522 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:34.268123 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:34.268103 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:16:34.525517 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:16:34.525444 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cc87fbcf9-826x6" Apr 20 21:17:02.197344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.197263 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7"] Apr 20 21:17:02.202102 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.202083 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.204350 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.204327 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 21:17:02.204468 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.204381 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 21:17:02.205052 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.205038 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wh4tl\"" Apr 20 21:17:02.209717 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.209692 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7"] Apr 20 21:17:02.350539 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.350517 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.350657 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.350555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsz9p\" (UniqueName: \"kubernetes.io/projected/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-kube-api-access-nsz9p\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" 
Apr 20 21:17:02.350657 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.350623 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.451491 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.451438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.451595 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.451498 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.451595 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.451520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsz9p\" (UniqueName: \"kubernetes.io/projected/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-kube-api-access-nsz9p\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.451790 ip-10-0-129-57 kubenswrapper[2567]: 
I0420 21:17:02.451760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.451850 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.451807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.459464 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.459441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsz9p\" (UniqueName: \"kubernetes.io/projected/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-kube-api-access-nsz9p\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.511406 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.511378 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" Apr 20 21:17:02.625951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:02.625924 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7"] Apr 20 21:17:02.628642 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:17:02.628612 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c6ba84_9851_4d59_8bbc_1a230cbed5f7.slice/crio-127071a6b4376a643357234713b98b268b6cfa6f533321f862ded43e79c762d0 WatchSource:0}: Error finding container 127071a6b4376a643357234713b98b268b6cfa6f533321f862ded43e79c762d0: Status 404 returned error can't find the container with id 127071a6b4376a643357234713b98b268b6cfa6f533321f862ded43e79c762d0 Apr 20 21:17:03.598235 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:03.598198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" event={"ID":"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7","Type":"ContainerStarted","Data":"127071a6b4376a643357234713b98b268b6cfa6f533321f862ded43e79c762d0"} Apr 20 21:17:11.622563 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:11.622524 2567 generic.go:358] "Generic (PLEG): container finished" podID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerID="f93827e920207f0080281b10d6eb290f35efc04b8db51541c08d8810bf7ead6f" exitCode=0 Apr 20 21:17:11.622912 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:11.622577 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" event={"ID":"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7","Type":"ContainerDied","Data":"f93827e920207f0080281b10d6eb290f35efc04b8db51541c08d8810bf7ead6f"} Apr 20 21:17:14.633473 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:17:14.633432 2567 generic.go:358] "Generic (PLEG): container finished" podID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerID="ebc3e26fe3946f113baa75286ea183663aa9ff7f513ccc1d76a13af25d83c9ec" exitCode=0 Apr 20 21:17:14.633840 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:14.633494 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" event={"ID":"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7","Type":"ContainerDied","Data":"ebc3e26fe3946f113baa75286ea183663aa9ff7f513ccc1d76a13af25d83c9ec"} Apr 20 21:17:23.661409 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:23.661373 2567 generic.go:358] "Generic (PLEG): container finished" podID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerID="0fb286593a9450b41d8bf0e3e042b5abee16813b98f7aa3fb0d78b2cab65c433" exitCode=0 Apr 20 21:17:23.661823 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:23.661433 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" event={"ID":"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7","Type":"ContainerDied","Data":"0fb286593a9450b41d8bf0e3e042b5abee16813b98f7aa3fb0d78b2cab65c433"} Apr 20 21:17:24.782312 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.782291 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7"
Apr 20 21:17:24.822484 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.822462 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsz9p\" (UniqueName: \"kubernetes.io/projected/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-kube-api-access-nsz9p\") pod \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") "
Apr 20 21:17:24.822587 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.822529 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-bundle\") pod \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") "
Apr 20 21:17:24.822587 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.822560 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-util\") pod \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\" (UID: \"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7\") "
Apr 20 21:17:24.823185 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.823164 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-bundle" (OuterVolumeSpecName: "bundle") pod "a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" (UID: "a7c6ba84-9851-4d59-8bbc-1a230cbed5f7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:17:24.824653 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.824634 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-kube-api-access-nsz9p" (OuterVolumeSpecName: "kube-api-access-nsz9p") pod "a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" (UID: "a7c6ba84-9851-4d59-8bbc-1a230cbed5f7"). InnerVolumeSpecName "kube-api-access-nsz9p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:17:24.826886 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.826862 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-util" (OuterVolumeSpecName: "util") pod "a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" (UID: "a7c6ba84-9851-4d59-8bbc-1a230cbed5f7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:17:24.923287 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.923238 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:17:24.923287 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.923260 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-util\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:17:24.923287 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:24.923269 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsz9p\" (UniqueName: \"kubernetes.io/projected/a7c6ba84-9851-4d59-8bbc-1a230cbed5f7-kube-api-access-nsz9p\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:17:25.668022 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:25.667969 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7" event={"ID":"a7c6ba84-9851-4d59-8bbc-1a230cbed5f7","Type":"ContainerDied","Data":"127071a6b4376a643357234713b98b268b6cfa6f533321f862ded43e79c762d0"}
Apr 20 21:17:25.668022 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:25.668023 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127071a6b4376a643357234713b98b268b6cfa6f533321f862ded43e79c762d0"
Apr 20 21:17:25.668234 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:25.668038 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbsqq7"
Apr 20 21:17:30.724214 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:30.724188 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 21:17:39.103973 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.103934 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8nxrz"]
Apr 20 21:17:39.106351 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.104225 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerName="extract"
Apr 20 21:17:39.106351 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.104236 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerName="extract"
Apr 20 21:17:39.106351 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.104246 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerName="pull"
Apr 20 21:17:39.106351 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.104251 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerName="pull"
Apr 20 21:17:39.106351 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.104265 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerName="util"
Apr 20 21:17:39.106351 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.104270 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerName="util"
Apr 20 21:17:39.106351 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.104308 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7c6ba84-9851-4d59-8bbc-1a230cbed5f7" containerName="extract"
Apr 20 21:17:39.107227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.107212 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.109376 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.109352 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 21:17:39.109494 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.109383 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-pssp4\""
Apr 20 21:17:39.110071 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.110056 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 21:17:39.115804 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.115784 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8nxrz"]
Apr 20 21:17:39.230972 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.230928 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkblt\" (UniqueName: \"kubernetes.io/projected/e1fe8490-ef8f-4244-954f-6593a5a1e076-kube-api-access-tkblt\") pod \"cert-manager-cainjector-68b757865b-8nxrz\" (UID: \"e1fe8490-ef8f-4244-954f-6593a5a1e076\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.230972 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.230977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1fe8490-ef8f-4244-954f-6593a5a1e076-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8nxrz\" (UID: \"e1fe8490-ef8f-4244-954f-6593a5a1e076\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.332311 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.332275 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkblt\" (UniqueName: \"kubernetes.io/projected/e1fe8490-ef8f-4244-954f-6593a5a1e076-kube-api-access-tkblt\") pod \"cert-manager-cainjector-68b757865b-8nxrz\" (UID: \"e1fe8490-ef8f-4244-954f-6593a5a1e076\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.332424 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.332318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1fe8490-ef8f-4244-954f-6593a5a1e076-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8nxrz\" (UID: \"e1fe8490-ef8f-4244-954f-6593a5a1e076\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.340362 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.340339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkblt\" (UniqueName: \"kubernetes.io/projected/e1fe8490-ef8f-4244-954f-6593a5a1e076-kube-api-access-tkblt\") pod \"cert-manager-cainjector-68b757865b-8nxrz\" (UID: \"e1fe8490-ef8f-4244-954f-6593a5a1e076\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.340362 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.340345 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1fe8490-ef8f-4244-954f-6593a5a1e076-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8nxrz\" (UID: \"e1fe8490-ef8f-4244-954f-6593a5a1e076\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.416295 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.416241 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz"
Apr 20 21:17:39.531461 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.531307 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8nxrz"]
Apr 20 21:17:39.533775 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:17:39.533739 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1fe8490_ef8f_4244_954f_6593a5a1e076.slice/crio-85b3e7d7d17d2d31d53ff10f0443aa404814a0a54222a9b89d150ffb1fb6c01b WatchSource:0}: Error finding container 85b3e7d7d17d2d31d53ff10f0443aa404814a0a54222a9b89d150ffb1fb6c01b: Status 404 returned error can't find the container with id 85b3e7d7d17d2d31d53ff10f0443aa404814a0a54222a9b89d150ffb1fb6c01b
Apr 20 21:17:39.535578 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.535562 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:17:39.706410 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:39.706350 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz" event={"ID":"e1fe8490-ef8f-4244-954f-6593a5a1e076","Type":"ContainerStarted","Data":"85b3e7d7d17d2d31d53ff10f0443aa404814a0a54222a9b89d150ffb1fb6c01b"}
Apr 20 21:17:43.717740 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:43.717701 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz" event={"ID":"e1fe8490-ef8f-4244-954f-6593a5a1e076","Type":"ContainerStarted","Data":"fd7e85abed6f03ec78c1cd14d588701958139233cbbaa9e28520a7200c04f52e"}
Apr 20 21:17:43.734477 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:43.734426 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-8nxrz" podStartSLOduration=1.52720512 podStartE2EDuration="4.734412286s" podCreationTimestamp="2026-04-20 21:17:39 +0000 UTC" firstStartedPulling="2026-04-20 21:17:39.535686774 +0000 UTC m=+309.264655351" lastFinishedPulling="2026-04-20 21:17:42.742893936 +0000 UTC m=+312.471862517" observedRunningTime="2026-04-20 21:17:43.732920703 +0000 UTC m=+313.461889303" watchObservedRunningTime="2026-04-20 21:17:43.734412286 +0000 UTC m=+313.463380886"
Apr 20 21:17:58.195259 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.195224 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"]
Apr 20 21:17:58.199165 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.199148 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.201531 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.201508 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 21:17:58.202342 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.202327 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:17:58.202396 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.202348 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-5qjxm\""
Apr 20 21:17:58.209137 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.209116 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"]
Apr 20 21:17:58.272330 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.272300 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b23b4958-e2b7-4866-90b5-761439aeea55-tmp\") pod \"openshift-lws-operator-bfc7f696d-lnxtf\" (UID: \"b23b4958-e2b7-4866-90b5-761439aeea55\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.272515 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.272346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5jg7\" (UniqueName: \"kubernetes.io/projected/b23b4958-e2b7-4866-90b5-761439aeea55-kube-api-access-d5jg7\") pod \"openshift-lws-operator-bfc7f696d-lnxtf\" (UID: \"b23b4958-e2b7-4866-90b5-761439aeea55\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.372998 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.372957 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b23b4958-e2b7-4866-90b5-761439aeea55-tmp\") pod \"openshift-lws-operator-bfc7f696d-lnxtf\" (UID: \"b23b4958-e2b7-4866-90b5-761439aeea55\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.373174 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.373036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5jg7\" (UniqueName: \"kubernetes.io/projected/b23b4958-e2b7-4866-90b5-761439aeea55-kube-api-access-d5jg7\") pod \"openshift-lws-operator-bfc7f696d-lnxtf\" (UID: \"b23b4958-e2b7-4866-90b5-761439aeea55\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.373346 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.373325 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b23b4958-e2b7-4866-90b5-761439aeea55-tmp\") pod \"openshift-lws-operator-bfc7f696d-lnxtf\" (UID: \"b23b4958-e2b7-4866-90b5-761439aeea55\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.381891 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.381860 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5jg7\" (UniqueName: \"kubernetes.io/projected/b23b4958-e2b7-4866-90b5-761439aeea55-kube-api-access-d5jg7\") pod \"openshift-lws-operator-bfc7f696d-lnxtf\" (UID: \"b23b4958-e2b7-4866-90b5-761439aeea55\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.508138 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.508062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"
Apr 20 21:17:58.621715 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.621684 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf"]
Apr 20 21:17:58.624852 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:17:58.624825 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb23b4958_e2b7_4866_90b5_761439aeea55.slice/crio-9f24783e75327a861fc1c2e5ca8eca6d82b6d19e0c27040d3e554cc43d66b65c WatchSource:0}: Error finding container 9f24783e75327a861fc1c2e5ca8eca6d82b6d19e0c27040d3e554cc43d66b65c: Status 404 returned error can't find the container with id 9f24783e75327a861fc1c2e5ca8eca6d82b6d19e0c27040d3e554cc43d66b65c
Apr 20 21:17:58.764106 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:17:58.764018 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf" event={"ID":"b23b4958-e2b7-4866-90b5-761439aeea55","Type":"ContainerStarted","Data":"9f24783e75327a861fc1c2e5ca8eca6d82b6d19e0c27040d3e554cc43d66b65c"}
Apr 20 21:18:01.774897 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:01.774860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf" event={"ID":"b23b4958-e2b7-4866-90b5-761439aeea55","Type":"ContainerStarted","Data":"af50a705c0335767e91f537cf49472ec5ab359fd0d9c9947dca091edc8d8c7dd"}
Apr 20 21:18:01.791002 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:01.790938 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-lnxtf" podStartSLOduration=1.325686914 podStartE2EDuration="3.790926195s" podCreationTimestamp="2026-04-20 21:17:58 +0000 UTC" firstStartedPulling="2026-04-20 21:17:58.626799604 +0000 UTC m=+328.355768185" lastFinishedPulling="2026-04-20 21:18:01.09203889 +0000 UTC m=+330.821007466" observedRunningTime="2026-04-20 21:18:01.789431968 +0000 UTC m=+331.518400567" watchObservedRunningTime="2026-04-20 21:18:01.790926195 +0000 UTC m=+331.519894794"
Apr 20 21:18:04.107404 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.107367 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"]
Apr 20 21:18:04.110864 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.110848 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.113186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.113164 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wh4tl\""
Apr 20 21:18:04.113186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.113180 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 21:18:04.113916 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.113901 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 21:18:04.119436 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.119413 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"]
Apr 20 21:18:04.225900 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.225863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.226061 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.225915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kx7k\" (UniqueName: \"kubernetes.io/projected/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-kube-api-access-7kx7k\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.226061 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.226045 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.327268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.327243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.327418 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.327281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.327418 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.327316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kx7k\" (UniqueName: \"kubernetes.io/projected/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-kube-api-access-7kx7k\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.327629 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.327612 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.327688 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.327673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.335436 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.335405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kx7k\" (UniqueName: \"kubernetes.io/projected/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-kube-api-access-7kx7k\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.421446 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.421368 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:04.537465 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.537437 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"]
Apr 20 21:18:04.540662 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:04.540638 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod700c206a_66d8_4b49_bc21_7ac7d62bb2e9.slice/crio-de776cde3590a5bb71ffdb69c7b99ca914801d308ffb43d675811112fba50597 WatchSource:0}: Error finding container de776cde3590a5bb71ffdb69c7b99ca914801d308ffb43d675811112fba50597: Status 404 returned error can't find the container with id de776cde3590a5bb71ffdb69c7b99ca914801d308ffb43d675811112fba50597
Apr 20 21:18:04.786841 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.786755 2567 generic.go:358] "Generic (PLEG): container finished" podID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerID="e49151048d4dfd957b1f72860111906d74641c721c1545d2e092403a56fa572b" exitCode=0
Apr 20 21:18:04.786841 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.786808 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw" event={"ID":"700c206a-66d8-4b49-bc21-7ac7d62bb2e9","Type":"ContainerDied","Data":"e49151048d4dfd957b1f72860111906d74641c721c1545d2e092403a56fa572b"}
Apr 20 21:18:04.786841 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:04.786834 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw" event={"ID":"700c206a-66d8-4b49-bc21-7ac7d62bb2e9","Type":"ContainerStarted","Data":"de776cde3590a5bb71ffdb69c7b99ca914801d308ffb43d675811112fba50597"}
Apr 20 21:18:05.791678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:05.791594 2567 generic.go:358] "Generic (PLEG): container finished" podID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerID="d06ea330c499974645f64bbe60c0387e12a880568eb0fa7c99a67b89b5683b89" exitCode=0
Apr 20 21:18:05.791678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:05.791664 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw" event={"ID":"700c206a-66d8-4b49-bc21-7ac7d62bb2e9","Type":"ContainerDied","Data":"d06ea330c499974645f64bbe60c0387e12a880568eb0fa7c99a67b89b5683b89"}
Apr 20 21:18:06.796636 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:06.796603 2567 generic.go:358] "Generic (PLEG): container finished" podID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerID="a164f0af2851bba137f9649087a22b3abd1ba5af431e8ce6d646de0b49fd4336" exitCode=0
Apr 20 21:18:06.797023 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:06.796650 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw" event={"ID":"700c206a-66d8-4b49-bc21-7ac7d62bb2e9","Type":"ContainerDied","Data":"a164f0af2851bba137f9649087a22b3abd1ba5af431e8ce6d646de0b49fd4336"}
Apr 20 21:18:07.919532 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:07.919508 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:08.061701 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.057217 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-bundle\") pod \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") "
Apr 20 21:18:08.061701 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.057298 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kx7k\" (UniqueName: \"kubernetes.io/projected/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-kube-api-access-7kx7k\") pod \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") "
Apr 20 21:18:08.061701 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.057351 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-util\") pod \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\" (UID: \"700c206a-66d8-4b49-bc21-7ac7d62bb2e9\") "
Apr 20 21:18:08.062618 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.062584 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-bundle" (OuterVolumeSpecName: "bundle") pod "700c206a-66d8-4b49-bc21-7ac7d62bb2e9" (UID: "700c206a-66d8-4b49-bc21-7ac7d62bb2e9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:18:08.064115 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.064092 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-kube-api-access-7kx7k" (OuterVolumeSpecName: "kube-api-access-7kx7k") pod "700c206a-66d8-4b49-bc21-7ac7d62bb2e9" (UID: "700c206a-66d8-4b49-bc21-7ac7d62bb2e9"). InnerVolumeSpecName "kube-api-access-7kx7k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:18:08.064890 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.064869 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-util" (OuterVolumeSpecName: "util") pod "700c206a-66d8-4b49-bc21-7ac7d62bb2e9" (UID: "700c206a-66d8-4b49-bc21-7ac7d62bb2e9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:18:08.158333 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.158295 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-util\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:18:08.158333 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.158331 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:18:08.158526 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.158345 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7kx7k\" (UniqueName: \"kubernetes.io/projected/700c206a-66d8-4b49-bc21-7ac7d62bb2e9-kube-api-access-7kx7k\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:18:08.804131 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.804104 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw"
Apr 20 21:18:08.807071 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.807033 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5d9whw" event={"ID":"700c206a-66d8-4b49-bc21-7ac7d62bb2e9","Type":"ContainerDied","Data":"de776cde3590a5bb71ffdb69c7b99ca914801d308ffb43d675811112fba50597"}
Apr 20 21:18:08.807071 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:08.807072 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de776cde3590a5bb71ffdb69c7b99ca914801d308ffb43d675811112fba50597"
Apr 20 21:18:18.084523 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.084492 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"]
Apr 20 21:18:18.085081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.084892 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerName="util"
Apr 20 21:18:18.085081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.084909 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerName="util"
Apr 20 21:18:18.085081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.084937 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerName="pull"
Apr 20 21:18:18.085081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.084963 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerName="pull"
Apr 20 21:18:18.085081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.084978 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerName="extract"
Apr 20 21:18:18.085081 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.085009 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerName="extract"
Apr 20 21:18:18.085373 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.085094 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="700c206a-66d8-4b49-bc21-7ac7d62bb2e9" containerName="extract"
Apr 20 21:18:18.090599 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.090579 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.092881 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.092856 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 21:18:18.092983 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.092876 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 21:18:18.093092 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.093073 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 21:18:18.093450 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.093435 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-hgbvg\""
Apr 20 21:18:18.093671 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.093657 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 21:18:18.102038 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.102017 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"]
Apr 20 21:18:18.136014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.135955 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjfj\" (UniqueName: \"kubernetes.io/projected/640c891e-2faa-4e30-bdd1-e531b6ec685f-kube-api-access-cwjfj\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.136177 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.136027 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/640c891e-2faa-4e30-bdd1-e531b6ec685f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.136177 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.136072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/640c891e-2faa-4e30-bdd1-e531b6ec685f-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.237339 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.237302 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/640c891e-2faa-4e30-bdd1-e531b6ec685f-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.237527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.237362 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwjfj\" (UniqueName: \"kubernetes.io/projected/640c891e-2faa-4e30-bdd1-e531b6ec685f-kube-api-access-cwjfj\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.237527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.237409 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/640c891e-2faa-4e30-bdd1-e531b6ec685f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.239799 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.239775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/640c891e-2faa-4e30-bdd1-e531b6ec685f-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.239923 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.239901 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/640c891e-2faa-4e30-bdd1-e531b6ec685f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"
Apr 20 21:18:18.245668 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.245639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cwjfj\" (UniqueName: \"kubernetes.io/projected/640c891e-2faa-4e30-bdd1-e531b6ec685f-kube-api-access-cwjfj\") pod \"opendatahub-operator-controller-manager-85fc55dd88-wpc79\" (UID: \"640c891e-2faa-4e30-bdd1-e531b6ec685f\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79" Apr 20 21:18:18.402943 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.402915 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79" Apr 20 21:18:18.531703 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.531676 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79"] Apr 20 21:18:18.534727 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:18.534699 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod640c891e_2faa_4e30_bdd1_e531b6ec685f.slice/crio-45ff87e19625ce4a34a4a0a851753c0e234f57cc9c222481155b21c03a7db816 WatchSource:0}: Error finding container 45ff87e19625ce4a34a4a0a851753c0e234f57cc9c222481155b21c03a7db816: Status 404 returned error can't find the container with id 45ff87e19625ce4a34a4a0a851753c0e234f57cc9c222481155b21c03a7db816 Apr 20 21:18:18.565737 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.565705 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg"] Apr 20 21:18:18.572055 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.572019 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.574773 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.574751 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 21:18:18.574883 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.574818 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wh4tl\"" Apr 20 21:18:18.575106 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.575091 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 21:18:18.577419 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.577401 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg"] Apr 20 21:18:18.640557 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.640526 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.640702 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.640566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5ccm\" (UniqueName: \"kubernetes.io/projected/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-kube-api-access-p5ccm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.640702 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.640584 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.741799 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.741727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.741799 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.741767 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5ccm\" (UniqueName: \"kubernetes.io/projected/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-kube-api-access-p5ccm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.741799 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.741784 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.742177 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.742161 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.742218 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.742183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.753710 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.753682 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5ccm\" (UniqueName: \"kubernetes.io/projected/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-kube-api-access-p5ccm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:18.829211 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.829183 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79" event={"ID":"640c891e-2faa-4e30-bdd1-e531b6ec685f","Type":"ContainerStarted","Data":"45ff87e19625ce4a34a4a0a851753c0e234f57cc9c222481155b21c03a7db816"} Apr 20 21:18:18.882150 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:18.882127 2567 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:19.000947 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:19.000911 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg"] Apr 20 21:18:19.003045 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:19.003017 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1aaf6cf_67c8_4f94_bce7_d74f2aa4d917.slice/crio-c91439d80fd5a92dc51dd52283644f60b7c7f310a6f2b4cad4b717e631187272 WatchSource:0}: Error finding container c91439d80fd5a92dc51dd52283644f60b7c7f310a6f2b4cad4b717e631187272: Status 404 returned error can't find the container with id c91439d80fd5a92dc51dd52283644f60b7c7f310a6f2b4cad4b717e631187272 Apr 20 21:18:19.834852 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:19.834819 2567 generic.go:358] "Generic (PLEG): container finished" podID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerID="24905ee0a14b456b2d1773bb83343a58474ce3990b8923f3b59908347cf86adf" exitCode=0 Apr 20 21:18:19.835318 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:19.834877 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" event={"ID":"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917","Type":"ContainerDied","Data":"24905ee0a14b456b2d1773bb83343a58474ce3990b8923f3b59908347cf86adf"} Apr 20 21:18:19.835318 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:19.834905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" event={"ID":"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917","Type":"ContainerStarted","Data":"c91439d80fd5a92dc51dd52283644f60b7c7f310a6f2b4cad4b717e631187272"} Apr 20 21:18:21.842231 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:18:21.842202 2567 generic.go:358] "Generic (PLEG): container finished" podID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerID="7033b717ee0801af198145f8b82913e887030a2b2749cd410393449b9edf8fec" exitCode=0 Apr 20 21:18:21.842644 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:21.842287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" event={"ID":"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917","Type":"ContainerDied","Data":"7033b717ee0801af198145f8b82913e887030a2b2749cd410393449b9edf8fec"} Apr 20 21:18:21.843667 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:21.843635 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79" event={"ID":"640c891e-2faa-4e30-bdd1-e531b6ec685f","Type":"ContainerStarted","Data":"51b93fd5ca6142ef402c82c4a2649c92055938d9e95e67fdfea56647e4aa86d6"} Apr 20 21:18:21.843796 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:21.843755 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79" Apr 20 21:18:21.888905 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:21.888372 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79" podStartSLOduration=1.320843387 podStartE2EDuration="3.888337531s" podCreationTimestamp="2026-04-20 21:18:18 +0000 UTC" firstStartedPulling="2026-04-20 21:18:18.536568607 +0000 UTC m=+348.265537185" lastFinishedPulling="2026-04-20 21:18:21.104062752 +0000 UTC m=+350.833031329" observedRunningTime="2026-04-20 21:18:21.8851804 +0000 UTC m=+351.614148997" watchObservedRunningTime="2026-04-20 21:18:21.888337531 +0000 UTC m=+351.617306134" Apr 20 21:18:22.848574 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:22.848539 2567 generic.go:358] "Generic (PLEG): 
container finished" podID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerID="378f7ea62c91bb573bd43f717e3f0db77fec5e516c1a6be8823aa8d288feb24c" exitCode=0 Apr 20 21:18:22.848944 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:22.848626 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" event={"ID":"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917","Type":"ContainerDied","Data":"378f7ea62c91bb573bd43f717e3f0db77fec5e516c1a6be8823aa8d288feb24c"} Apr 20 21:18:23.967824 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:23.967804 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:23.985320 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:23.985295 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-bundle\") pod \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " Apr 20 21:18:23.985472 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:23.985334 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-util\") pod \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " Apr 20 21:18:23.985472 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:23.985368 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5ccm\" (UniqueName: \"kubernetes.io/projected/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-kube-api-access-p5ccm\") pod \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\" (UID: \"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917\") " Apr 20 21:18:23.986729 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:23.986687 2567 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-bundle" (OuterVolumeSpecName: "bundle") pod "e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" (UID: "e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:18:23.987981 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:23.987958 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-kube-api-access-p5ccm" (OuterVolumeSpecName: "kube-api-access-p5ccm") pod "e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" (UID: "e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917"). InnerVolumeSpecName "kube-api-access-p5ccm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:18:23.993407 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:23.993380 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-util" (OuterVolumeSpecName: "util") pod "e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" (UID: "e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:18:24.086545 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:24.086515 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:24.086545 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:24.086539 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-util\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:24.086545 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:24.086550 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5ccm\" (UniqueName: \"kubernetes.io/projected/e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917-kube-api-access-p5ccm\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:24.856530 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:24.856505 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" Apr 20 21:18:24.856686 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:24.856527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97wthg" event={"ID":"e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917","Type":"ContainerDied","Data":"c91439d80fd5a92dc51dd52283644f60b7c7f310a6f2b4cad4b717e631187272"} Apr 20 21:18:24.856686 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:24.856558 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91439d80fd5a92dc51dd52283644f60b7c7f310a6f2b4cad4b717e631187272" Apr 20 21:18:32.850935 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:32.850907 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-wpc79" Apr 20 21:18:36.994738 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.994705 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd"] Apr 20 21:18:36.995098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.995007 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerName="extract" Apr 20 21:18:36.995098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.995018 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerName="extract" Apr 20 21:18:36.995098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.995028 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerName="util" Apr 20 21:18:36.995098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.995034 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" 
containerName="util" Apr 20 21:18:36.995098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.995041 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerName="pull" Apr 20 21:18:36.995098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.995047 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerName="pull" Apr 20 21:18:36.995098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.995091 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1aaf6cf-67c8-4f94-bce7-d74f2aa4d917" containerName="extract" Apr 20 21:18:36.999306 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:36.999286 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.001566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.001547 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 21:18:37.002124 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.002098 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 21:18:37.003075 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.003056 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 21:18:37.003163 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.003069 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 21:18:37.003163 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.003073 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-9lwtz\"" Apr 20 21:18:37.010878 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:18:37.010856 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd"] Apr 20 21:18:37.083732 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.083705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmrf6\" (UniqueName: \"kubernetes.io/projected/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-kube-api-access-xmrf6\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.083831 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.083738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-tmp\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.083831 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.083802 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-tls-certs\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.184572 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.184545 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-tls-certs\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.184735 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.184599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xmrf6\" (UniqueName: \"kubernetes.io/projected/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-kube-api-access-xmrf6\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.184735 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.184630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-tmp\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.186939 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.186905 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-tmp\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.187075 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.187056 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-tls-certs\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.192505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.192484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmrf6\" (UniqueName: \"kubernetes.io/projected/d2ea1a05-eff3-4922-9ed4-03c04ccf987c-kube-api-access-xmrf6\") pod \"kube-auth-proxy-b57dc9cf9-qjjgd\" (UID: \"d2ea1a05-eff3-4922-9ed4-03c04ccf987c\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.309459 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.309380 2567 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" Apr 20 21:18:37.428682 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.428617 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd"] Apr 20 21:18:37.431037 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:37.431006 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ea1a05_eff3_4922_9ed4_03c04ccf987c.slice/crio-901e11b74304415a83d7fe57bc315f90a2a46a5838a62ace53a11d04743ea4b0 WatchSource:0}: Error finding container 901e11b74304415a83d7fe57bc315f90a2a46a5838a62ace53a11d04743ea4b0: Status 404 returned error can't find the container with id 901e11b74304415a83d7fe57bc315f90a2a46a5838a62ace53a11d04743ea4b0 Apr 20 21:18:37.905633 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:37.905598 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" event={"ID":"d2ea1a05-eff3-4922-9ed4-03c04ccf987c","Type":"ContainerStarted","Data":"901e11b74304415a83d7fe57bc315f90a2a46a5838a62ace53a11d04743ea4b0"} Apr 20 21:18:38.712021 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.711973 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g"] Apr 20 21:18:38.715965 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.715946 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.718346 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.718324 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 21:18:38.718458 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.718354 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wh4tl\"" Apr 20 21:18:38.718458 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.718438 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 21:18:38.723463 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.723441 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g"] Apr 20 21:18:38.798667 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.798634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.798828 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.798687 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9tn4\" (UniqueName: \"kubernetes.io/projected/de67314e-9e0d-45b8-a00c-4808b85f7f26-kube-api-access-g9tn4\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.798828 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.798744 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.899910 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.899870 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.900142 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.900007 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.900142 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.900052 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9tn4\" (UniqueName: \"kubernetes.io/projected/de67314e-9e0d-45b8-a00c-4808b85f7f26-kube-api-access-g9tn4\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.900310 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.900286 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.900384 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.900339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:38.908195 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:38.908171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9tn4\" (UniqueName: \"kubernetes.io/projected/de67314e-9e0d-45b8-a00c-4808b85f7f26-kube-api-access-g9tn4\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:39.028726 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:39.028649 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:39.469276 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:39.469242 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g"] Apr 20 21:18:39.471741 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:39.471710 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde67314e_9e0d_45b8_a00c_4808b85f7f26.slice/crio-268b348b2bf3c097738039ce477ff46891897e44783c5b085250bb65ebdc6a4d WatchSource:0}: Error finding container 268b348b2bf3c097738039ce477ff46891897e44783c5b085250bb65ebdc6a4d: Status 404 returned error can't find the container with id 268b348b2bf3c097738039ce477ff46891897e44783c5b085250bb65ebdc6a4d Apr 20 21:18:39.914611 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:39.914582 2567 generic.go:358] "Generic (PLEG): container finished" podID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerID="c87769ceeb882faf5c895203da55ef8bcd59fa11947eaab332ea3b89dba10e56" exitCode=0 Apr 20 21:18:39.915035 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:39.914676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" event={"ID":"de67314e-9e0d-45b8-a00c-4808b85f7f26","Type":"ContainerDied","Data":"c87769ceeb882faf5c895203da55ef8bcd59fa11947eaab332ea3b89dba10e56"} Apr 20 21:18:39.915035 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:39.914713 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" event={"ID":"de67314e-9e0d-45b8-a00c-4808b85f7f26","Type":"ContainerStarted","Data":"268b348b2bf3c097738039ce477ff46891897e44783c5b085250bb65ebdc6a4d"} Apr 20 21:18:40.083604 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:18:40.083556 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-kp466"] Apr 20 21:18:40.086193 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.086170 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:40.088547 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.088522 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 21:18:40.088872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.088852 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-k2fn9\"" Apr 20 21:18:40.093610 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.093587 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-kp466"] Apr 20 21:18:40.210238 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.210157 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzwh\" (UniqueName: \"kubernetes.io/projected/f82b7fb7-1330-451e-8bb2-65948b52b19c-kube-api-access-pnzwh\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: \"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:40.210238 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.210231 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: \"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:40.311380 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.311327 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-pnzwh\" (UniqueName: \"kubernetes.io/projected/f82b7fb7-1330-451e-8bb2-65948b52b19c-kube-api-access-pnzwh\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: \"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:40.311564 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.311433 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: \"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:40.311619 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:40.311596 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 21:18:40.311653 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:40.311644 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert podName:f82b7fb7-1330-451e-8bb2-65948b52b19c nodeName:}" failed. No retries permitted until 2026-04-20 21:18:40.811629382 +0000 UTC m=+370.540597962 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert") pod "odh-model-controller-858dbf95b8-kp466" (UID: "f82b7fb7-1330-451e-8bb2-65948b52b19c") : secret "odh-model-controller-webhook-cert" not found Apr 20 21:18:40.322951 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.322898 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzwh\" (UniqueName: \"kubernetes.io/projected/f82b7fb7-1330-451e-8bb2-65948b52b19c-kube-api-access-pnzwh\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: \"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:40.817157 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.817125 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: \"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:40.817312 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:40.817282 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 21:18:40.817354 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:40.817344 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert podName:f82b7fb7-1330-451e-8bb2-65948b52b19c nodeName:}" failed. No retries permitted until 2026-04-20 21:18:41.817327696 +0000 UTC m=+371.546296276 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert") pod "odh-model-controller-858dbf95b8-kp466" (UID: "f82b7fb7-1330-451e-8bb2-65948b52b19c") : secret "odh-model-controller-webhook-cert" not found Apr 20 21:18:40.920278 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.920246 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" event={"ID":"d2ea1a05-eff3-4922-9ed4-03c04ccf987c","Type":"ContainerStarted","Data":"9d83fd96b53bba148a4134faeff7ac5b4069edbbea7c39c1633a392bfb871b71"} Apr 20 21:18:40.921713 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.921690 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" event={"ID":"de67314e-9e0d-45b8-a00c-4808b85f7f26","Type":"ContainerStarted","Data":"e5aa25271b964c73159797807aba5f3841b4c74e1e63b37427d286a6152100ad"} Apr 20 21:18:40.935702 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:40.935659 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-qjjgd" podStartSLOduration=1.6850498790000001 podStartE2EDuration="4.935646408s" podCreationTimestamp="2026-04-20 21:18:36 +0000 UTC" firstStartedPulling="2026-04-20 21:18:37.433180327 +0000 UTC m=+367.162148907" lastFinishedPulling="2026-04-20 21:18:40.683776855 +0000 UTC m=+370.412745436" observedRunningTime="2026-04-20 21:18:40.934507861 +0000 UTC m=+370.663476463" watchObservedRunningTime="2026-04-20 21:18:40.935646408 +0000 UTC m=+370.664615007" Apr 20 21:18:41.826475 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:41.826434 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: 
\"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:41.829033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:41.829002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82b7fb7-1330-451e-8bb2-65948b52b19c-cert\") pod \"odh-model-controller-858dbf95b8-kp466\" (UID: \"f82b7fb7-1330-451e-8bb2-65948b52b19c\") " pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:41.901127 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:41.901086 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:41.927782 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:41.927748 2567 generic.go:358] "Generic (PLEG): container finished" podID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerID="e5aa25271b964c73159797807aba5f3841b4c74e1e63b37427d286a6152100ad" exitCode=0 Apr 20 21:18:41.927782 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:41.927782 2567 generic.go:358] "Generic (PLEG): container finished" podID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerID="7bc4514dd37027654a1658cd7d1abe55708721501a0e4aea5c81bdc49336894b" exitCode=0 Apr 20 21:18:41.928183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:41.927818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" event={"ID":"de67314e-9e0d-45b8-a00c-4808b85f7f26","Type":"ContainerDied","Data":"e5aa25271b964c73159797807aba5f3841b4c74e1e63b37427d286a6152100ad"} Apr 20 21:18:41.928183 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:41.927868 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" 
event={"ID":"de67314e-9e0d-45b8-a00c-4808b85f7f26","Type":"ContainerDied","Data":"7bc4514dd37027654a1658cd7d1abe55708721501a0e4aea5c81bdc49336894b"} Apr 20 21:18:42.021589 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:42.021558 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-kp466"] Apr 20 21:18:42.025057 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:42.025031 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82b7fb7_1330_451e_8bb2_65948b52b19c.slice/crio-8439005fa3a92dd08f34c9b0ffda4fdc6ff3000e0b2f2d1d26fd3d125dbd3727 WatchSource:0}: Error finding container 8439005fa3a92dd08f34c9b0ffda4fdc6ff3000e0b2f2d1d26fd3d125dbd3727: Status 404 returned error can't find the container with id 8439005fa3a92dd08f34c9b0ffda4fdc6ff3000e0b2f2d1d26fd3d125dbd3727 Apr 20 21:18:42.933736 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:42.933696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" event={"ID":"f82b7fb7-1330-451e-8bb2-65948b52b19c","Type":"ContainerStarted","Data":"8439005fa3a92dd08f34c9b0ffda4fdc6ff3000e0b2f2d1d26fd3d125dbd3727"} Apr 20 21:18:43.064128 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.064102 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:43.137581 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.137551 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-bundle\") pod \"de67314e-9e0d-45b8-a00c-4808b85f7f26\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " Apr 20 21:18:43.137759 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.137594 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9tn4\" (UniqueName: \"kubernetes.io/projected/de67314e-9e0d-45b8-a00c-4808b85f7f26-kube-api-access-g9tn4\") pod \"de67314e-9e0d-45b8-a00c-4808b85f7f26\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " Apr 20 21:18:43.137759 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.137618 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-util\") pod \"de67314e-9e0d-45b8-a00c-4808b85f7f26\" (UID: \"de67314e-9e0d-45b8-a00c-4808b85f7f26\") " Apr 20 21:18:43.138686 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.138656 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-bundle" (OuterVolumeSpecName: "bundle") pod "de67314e-9e0d-45b8-a00c-4808b85f7f26" (UID: "de67314e-9e0d-45b8-a00c-4808b85f7f26"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:18:43.140301 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.140275 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de67314e-9e0d-45b8-a00c-4808b85f7f26-kube-api-access-g9tn4" (OuterVolumeSpecName: "kube-api-access-g9tn4") pod "de67314e-9e0d-45b8-a00c-4808b85f7f26" (UID: "de67314e-9e0d-45b8-a00c-4808b85f7f26"). InnerVolumeSpecName "kube-api-access-g9tn4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:18:43.143812 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.143773 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-util" (OuterVolumeSpecName: "util") pod "de67314e-9e0d-45b8-a00c-4808b85f7f26" (UID: "de67314e-9e0d-45b8-a00c-4808b85f7f26"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:18:43.238664 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.238537 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:43.238664 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.238575 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9tn4\" (UniqueName: \"kubernetes.io/projected/de67314e-9e0d-45b8-a00c-4808b85f7f26-kube-api-access-g9tn4\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:43.238664 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.238591 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de67314e-9e0d-45b8-a00c-4808b85f7f26-util\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:43.941748 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.941708 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" event={"ID":"de67314e-9e0d-45b8-a00c-4808b85f7f26","Type":"ContainerDied","Data":"268b348b2bf3c097738039ce477ff46891897e44783c5b085250bb65ebdc6a4d"} Apr 20 21:18:43.941748 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.941749 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="268b348b2bf3c097738039ce477ff46891897e44783c5b085250bb65ebdc6a4d" Apr 20 21:18:43.942290 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:43.941796 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835szm9g" Apr 20 21:18:44.947062 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:44.947029 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" event={"ID":"f82b7fb7-1330-451e-8bb2-65948b52b19c","Type":"ContainerStarted","Data":"8966b1266a7d8538c54e53876ee3cf0158b42cc28455c0da83c262b89ba5d59c"} Apr 20 21:18:44.947508 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:44.947156 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:44.970625 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:44.970577 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" podStartSLOduration=2.144906492 podStartE2EDuration="4.97056309s" podCreationTimestamp="2026-04-20 21:18:40 +0000 UTC" firstStartedPulling="2026-04-20 21:18:42.026237076 +0000 UTC m=+371.755205657" lastFinishedPulling="2026-04-20 21:18:44.851893674 +0000 UTC m=+374.580862255" observedRunningTime="2026-04-20 21:18:44.968123914 +0000 UTC m=+374.697092514" watchObservedRunningTime="2026-04-20 21:18:44.97056309 +0000 UTC m=+374.699531689" Apr 20 21:18:45.951126 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:18:45.951089 2567 generic.go:358] "Generic (PLEG): container finished" podID="f82b7fb7-1330-451e-8bb2-65948b52b19c" containerID="8966b1266a7d8538c54e53876ee3cf0158b42cc28455c0da83c262b89ba5d59c" exitCode=1 Apr 20 21:18:45.951573 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:45.951179 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" event={"ID":"f82b7fb7-1330-451e-8bb2-65948b52b19c","Type":"ContainerDied","Data":"8966b1266a7d8538c54e53876ee3cf0158b42cc28455c0da83c262b89ba5d59c"} Apr 20 21:18:45.951573 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:45.951468 2567 scope.go:117] "RemoveContainer" containerID="8966b1266a7d8538c54e53876ee3cf0158b42cc28455c0da83c262b89ba5d59c" Apr 20 21:18:46.196882 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.196848 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-cxmzf"] Apr 20 21:18:46.197229 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.197211 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerName="pull" Apr 20 21:18:46.197229 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.197226 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerName="pull" Apr 20 21:18:46.197343 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.197238 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerName="util" Apr 20 21:18:46.197343 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.197243 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerName="util" Apr 20 21:18:46.197343 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.197261 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerName="extract" Apr 20 21:18:46.197343 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.197267 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerName="extract" Apr 20 21:18:46.197343 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.197312 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="de67314e-9e0d-45b8-a00c-4808b85f7f26" containerName="extract" Apr 20 21:18:46.199154 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.199138 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.201319 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.201257 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 21:18:46.201418 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.201349 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-lpxq4\"" Apr 20 21:18:46.209489 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.209452 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-cxmzf"] Apr 20 21:18:46.264105 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.264082 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ec111a7-2378-4ead-875a-06574d248b03-cert\") pod \"kserve-controller-manager-856948b99f-cxmzf\" (UID: \"6ec111a7-2378-4ead-875a-06574d248b03\") " pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.264193 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.264134 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696g2\" (UniqueName: 
\"kubernetes.io/projected/6ec111a7-2378-4ead-875a-06574d248b03-kube-api-access-696g2\") pod \"kserve-controller-manager-856948b99f-cxmzf\" (UID: \"6ec111a7-2378-4ead-875a-06574d248b03\") " pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.364675 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.364643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-696g2\" (UniqueName: \"kubernetes.io/projected/6ec111a7-2378-4ead-875a-06574d248b03-kube-api-access-696g2\") pod \"kserve-controller-manager-856948b99f-cxmzf\" (UID: \"6ec111a7-2378-4ead-875a-06574d248b03\") " pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.364851 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.364710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ec111a7-2378-4ead-875a-06574d248b03-cert\") pod \"kserve-controller-manager-856948b99f-cxmzf\" (UID: \"6ec111a7-2378-4ead-875a-06574d248b03\") " pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.364851 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:46.364802 2567 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 21:18:46.364935 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:46.364855 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ec111a7-2378-4ead-875a-06574d248b03-cert podName:6ec111a7-2378-4ead-875a-06574d248b03 nodeName:}" failed. No retries permitted until 2026-04-20 21:18:46.864838915 +0000 UTC m=+376.593807492 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ec111a7-2378-4ead-875a-06574d248b03-cert") pod "kserve-controller-manager-856948b99f-cxmzf" (UID: "6ec111a7-2378-4ead-875a-06574d248b03") : secret "kserve-webhook-server-cert" not found Apr 20 21:18:46.375805 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.375774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-696g2\" (UniqueName: \"kubernetes.io/projected/6ec111a7-2378-4ead-875a-06574d248b03-kube-api-access-696g2\") pod \"kserve-controller-manager-856948b99f-cxmzf\" (UID: \"6ec111a7-2378-4ead-875a-06574d248b03\") " pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.869960 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.869916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ec111a7-2378-4ead-875a-06574d248b03-cert\") pod \"kserve-controller-manager-856948b99f-cxmzf\" (UID: \"6ec111a7-2378-4ead-875a-06574d248b03\") " pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.872921 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.872892 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ec111a7-2378-4ead-875a-06574d248b03-cert\") pod \"kserve-controller-manager-856948b99f-cxmzf\" (UID: \"6ec111a7-2378-4ead-875a-06574d248b03\") " pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:46.955728 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.955696 2567 generic.go:358] "Generic (PLEG): container finished" podID="f82b7fb7-1330-451e-8bb2-65948b52b19c" containerID="22ee65e924435ac5747a4d160508eef74361ea57ac2e38f663fc7775a6afa551" exitCode=1 Apr 20 21:18:46.956130 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.955728 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/odh-model-controller-858dbf95b8-kp466" event={"ID":"f82b7fb7-1330-451e-8bb2-65948b52b19c","Type":"ContainerDied","Data":"22ee65e924435ac5747a4d160508eef74361ea57ac2e38f663fc7775a6afa551"} Apr 20 21:18:46.956130 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.955774 2567 scope.go:117] "RemoveContainer" containerID="8966b1266a7d8538c54e53876ee3cf0158b42cc28455c0da83c262b89ba5d59c" Apr 20 21:18:46.956130 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:46.956040 2567 scope.go:117] "RemoveContainer" containerID="22ee65e924435ac5747a4d160508eef74361ea57ac2e38f663fc7775a6afa551" Apr 20 21:18:46.956282 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:46.956243 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-kp466_opendatahub(f82b7fb7-1330-451e-8bb2-65948b52b19c)\"" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" podUID="f82b7fb7-1330-451e-8bb2-65948b52b19c" Apr 20 21:18:47.110527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:47.110500 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:47.239137 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:47.239109 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-cxmzf"] Apr 20 21:18:47.241612 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:47.241583 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec111a7_2378_4ead_875a_06574d248b03.slice/crio-85f1600ac04bce5a6b71653a6fe5f110c348ff8bc266bce5011aa42acd5e59a4 WatchSource:0}: Error finding container 85f1600ac04bce5a6b71653a6fe5f110c348ff8bc266bce5011aa42acd5e59a4: Status 404 returned error can't find the container with id 85f1600ac04bce5a6b71653a6fe5f110c348ff8bc266bce5011aa42acd5e59a4 Apr 20 21:18:47.961498 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:47.961471 2567 scope.go:117] "RemoveContainer" containerID="22ee65e924435ac5747a4d160508eef74361ea57ac2e38f663fc7775a6afa551" Apr 20 21:18:47.962044 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:47.961681 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-kp466_opendatahub(f82b7fb7-1330-451e-8bb2-65948b52b19c)\"" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" podUID="f82b7fb7-1330-451e-8bb2-65948b52b19c" Apr 20 21:18:47.962632 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:47.962608 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" event={"ID":"6ec111a7-2378-4ead-875a-06574d248b03","Type":"ContainerStarted","Data":"85f1600ac04bce5a6b71653a6fe5f110c348ff8bc266bce5011aa42acd5e59a4"} Apr 20 21:18:49.970059 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:49.970023 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" event={"ID":"6ec111a7-2378-4ead-875a-06574d248b03","Type":"ContainerStarted","Data":"98e765c22ea152e3a6ed954f08d4a99f4ad4a4ef80146bbf436de11ba7b47a5f"} Apr 20 21:18:49.970407 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:49.970164 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:18:49.988528 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:49.988451 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" podStartSLOduration=1.496436098 podStartE2EDuration="3.988438995s" podCreationTimestamp="2026-04-20 21:18:46 +0000 UTC" firstStartedPulling="2026-04-20 21:18:47.242926551 +0000 UTC m=+376.971895129" lastFinishedPulling="2026-04-20 21:18:49.734929446 +0000 UTC m=+379.463898026" observedRunningTime="2026-04-20 21:18:49.986826461 +0000 UTC m=+379.715795081" watchObservedRunningTime="2026-04-20 21:18:49.988438995 +0000 UTC m=+379.717407594" Apr 20 21:18:53.507856 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.507809 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2"] Apr 20 21:18:53.511074 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.511050 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.513686 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.513662 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wh4tl\"" Apr 20 21:18:53.513871 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.513783 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 21:18:53.514424 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.514402 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 21:18:53.521262 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.521231 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2"] Apr 20 21:18:53.624729 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.624700 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n588w\" (UniqueName: \"kubernetes.io/projected/a62e35f3-6f45-4493-ae43-3e3919a597c0-kube-api-access-n588w\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.624853 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.624741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.624853 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.624770 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.725865 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.725835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n588w\" (UniqueName: \"kubernetes.io/projected/a62e35f3-6f45-4493-ae43-3e3919a597c0-kube-api-access-n588w\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.726019 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.725872 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.726019 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.725890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.726311 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.726290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.726348 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.726304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.733822 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.733798 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n588w\" (UniqueName: \"kubernetes.io/projected/a62e35f3-6f45-4493-ae43-3e3919a597c0-kube-api-access-n588w\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.822188 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.822127 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:53.954846 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.954814 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2"] Apr 20 21:18:53.958047 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:53.958019 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda62e35f3_6f45_4493_ae43_3e3919a597c0.slice/crio-d8415050a425a10fb5859fcd22e8b257019cd099a65681ae73c7b24c374c5af3 WatchSource:0}: Error finding container d8415050a425a10fb5859fcd22e8b257019cd099a65681ae73c7b24c374c5af3: Status 404 returned error can't find the container with id d8415050a425a10fb5859fcd22e8b257019cd099a65681ae73c7b24c374c5af3 Apr 20 21:18:53.983282 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:53.983255 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" event={"ID":"a62e35f3-6f45-4493-ae43-3e3919a597c0","Type":"ContainerStarted","Data":"d8415050a425a10fb5859fcd22e8b257019cd099a65681ae73c7b24c374c5af3"} Apr 20 21:18:54.947399 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:54.947371 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:18:54.947728 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:54.947701 2567 scope.go:117] "RemoveContainer" containerID="22ee65e924435ac5747a4d160508eef74361ea57ac2e38f663fc7775a6afa551" Apr 20 21:18:54.947870 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:18:54.947853 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=odh-model-controller-858dbf95b8-kp466_opendatahub(f82b7fb7-1330-451e-8bb2-65948b52b19c)\"" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" podUID="f82b7fb7-1330-451e-8bb2-65948b52b19c" Apr 20 21:18:54.987170 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:54.987140 2567 generic.go:358] "Generic (PLEG): container finished" podID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerID="71c7308b30ad9a8c1974d7138bada2cbdcb597bec4174347cd6cd5d86671634d" exitCode=0 Apr 20 21:18:54.987271 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:54.987198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" event={"ID":"a62e35f3-6f45-4493-ae43-3e3919a597c0","Type":"ContainerDied","Data":"71c7308b30ad9a8c1974d7138bada2cbdcb597bec4174347cd6cd5d86671634d"} Apr 20 21:18:55.803875 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.801769 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z"] Apr 20 21:18:55.806462 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.806445 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:55.808819 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.808804 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 21:18:55.808928 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.808910 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 21:18:55.809048 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.809029 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-wznbf\"" Apr 20 21:18:55.824562 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.824537 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z"] Apr 20 21:18:55.946709 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.946680 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmnh\" (UniqueName: \"kubernetes.io/projected/5f6ca10e-449a-479e-b9b7-661327ff1c62-kube-api-access-9mmnh\") pod \"servicemesh-operator3-55f49c5f94-pdb8z\" (UID: \"5f6ca10e-449a-479e-b9b7-661327ff1c62\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:55.946845 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.946785 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5f6ca10e-449a-479e-b9b7-661327ff1c62-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pdb8z\" (UID: \"5f6ca10e-449a-479e-b9b7-661327ff1c62\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:55.992522 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.992464 2567 generic.go:358] "Generic (PLEG): 
container finished" podID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerID="e206d0552308833be1074b31d33dd89c113cfeb8a9c0e5bae8a75941ef7a4d88" exitCode=0 Apr 20 21:18:55.992522 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:55.992513 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" event={"ID":"a62e35f3-6f45-4493-ae43-3e3919a597c0","Type":"ContainerDied","Data":"e206d0552308833be1074b31d33dd89c113cfeb8a9c0e5bae8a75941ef7a4d88"} Apr 20 21:18:56.047869 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:56.047847 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmnh\" (UniqueName: \"kubernetes.io/projected/5f6ca10e-449a-479e-b9b7-661327ff1c62-kube-api-access-9mmnh\") pod \"servicemesh-operator3-55f49c5f94-pdb8z\" (UID: \"5f6ca10e-449a-479e-b9b7-661327ff1c62\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:56.047939 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:56.047892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5f6ca10e-449a-479e-b9b7-661327ff1c62-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pdb8z\" (UID: \"5f6ca10e-449a-479e-b9b7-661327ff1c62\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:56.050275 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:56.050253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5f6ca10e-449a-479e-b9b7-661327ff1c62-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pdb8z\" (UID: \"5f6ca10e-449a-479e-b9b7-661327ff1c62\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:56.057309 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:56.057288 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9mmnh\" (UniqueName: \"kubernetes.io/projected/5f6ca10e-449a-479e-b9b7-661327ff1c62-kube-api-access-9mmnh\") pod \"servicemesh-operator3-55f49c5f94-pdb8z\" (UID: \"5f6ca10e-449a-479e-b9b7-661327ff1c62\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:56.150898 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:56.150868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:56.274509 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:56.274484 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z"] Apr 20 21:18:56.276389 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:18:56.276362 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6ca10e_449a_479e_b9b7_661327ff1c62.slice/crio-faa0bb2563154f64d706cf330ece76b5e4c9616b7ea4d19b999cd17204764bd0 WatchSource:0}: Error finding container faa0bb2563154f64d706cf330ece76b5e4c9616b7ea4d19b999cd17204764bd0: Status 404 returned error can't find the container with id faa0bb2563154f64d706cf330ece76b5e4c9616b7ea4d19b999cd17204764bd0 Apr 20 21:18:56.998791 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:56.998753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" event={"ID":"5f6ca10e-449a-479e-b9b7-661327ff1c62","Type":"ContainerStarted","Data":"faa0bb2563154f64d706cf330ece76b5e4c9616b7ea4d19b999cd17204764bd0"} Apr 20 21:18:57.001197 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:57.001168 2567 generic.go:358] "Generic (PLEG): container finished" podID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerID="ef4338b95cf06cfca7cae3f05269f1efd90ae73264d782b3bc7c753e4009d042" exitCode=0 Apr 20 21:18:57.001330 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:57.001207 
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" event={"ID":"a62e35f3-6f45-4493-ae43-3e3919a597c0","Type":"ContainerDied","Data":"ef4338b95cf06cfca7cae3f05269f1efd90ae73264d782b3bc7c753e4009d042"} Apr 20 21:18:58.697892 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.697869 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:58.773269 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.773250 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-util\") pod \"a62e35f3-6f45-4493-ae43-3e3919a597c0\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " Apr 20 21:18:58.773386 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.773302 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-bundle\") pod \"a62e35f3-6f45-4493-ae43-3e3919a597c0\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " Apr 20 21:18:58.773386 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.773324 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n588w\" (UniqueName: \"kubernetes.io/projected/a62e35f3-6f45-4493-ae43-3e3919a597c0-kube-api-access-n588w\") pod \"a62e35f3-6f45-4493-ae43-3e3919a597c0\" (UID: \"a62e35f3-6f45-4493-ae43-3e3919a597c0\") " Apr 20 21:18:58.774422 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.774391 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-bundle" (OuterVolumeSpecName: "bundle") pod "a62e35f3-6f45-4493-ae43-3e3919a597c0" (UID: "a62e35f3-6f45-4493-ae43-3e3919a597c0"). 
InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:18:58.775416 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.775393 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62e35f3-6f45-4493-ae43-3e3919a597c0-kube-api-access-n588w" (OuterVolumeSpecName: "kube-api-access-n588w") pod "a62e35f3-6f45-4493-ae43-3e3919a597c0" (UID: "a62e35f3-6f45-4493-ae43-3e3919a597c0"). InnerVolumeSpecName "kube-api-access-n588w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:18:58.778421 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.778398 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-util" (OuterVolumeSpecName: "util") pod "a62e35f3-6f45-4493-ae43-3e3919a597c0" (UID: "a62e35f3-6f45-4493-ae43-3e3919a597c0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:18:58.874653 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.874618 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-util\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:58.874653 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.874655 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a62e35f3-6f45-4493-ae43-3e3919a597c0-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:58.874926 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:58.874670 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n588w\" (UniqueName: \"kubernetes.io/projected/a62e35f3-6f45-4493-ae43-3e3919a597c0-kube-api-access-n588w\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:18:59.010087 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:59.009962 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" event={"ID":"5f6ca10e-449a-479e-b9b7-661327ff1c62","Type":"ContainerStarted","Data":"e7454cb04d59fbfcd773206c51318b798f21c6b4f77cd571a7e1223664a0d53b"} Apr 20 21:18:59.010724 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:59.010700 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:18:59.012796 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:59.012770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" event={"ID":"a62e35f3-6f45-4493-ae43-3e3919a597c0","Type":"ContainerDied","Data":"d8415050a425a10fb5859fcd22e8b257019cd099a65681ae73c7b24c374c5af3"} Apr 20 21:18:59.012909 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:59.012805 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8415050a425a10fb5859fcd22e8b257019cd099a65681ae73c7b24c374c5af3" Apr 20 21:18:59.012909 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:59.012861 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2255q2" Apr 20 21:18:59.032906 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:18:59.032859 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" podStartSLOduration=1.5593627589999999 podStartE2EDuration="4.03284374s" podCreationTimestamp="2026-04-20 21:18:55 +0000 UTC" firstStartedPulling="2026-04-20 21:18:56.278842112 +0000 UTC m=+386.007810689" lastFinishedPulling="2026-04-20 21:18:58.752323089 +0000 UTC m=+388.481291670" observedRunningTime="2026-04-20 21:18:59.031515304 +0000 UTC m=+388.760483904" watchObservedRunningTime="2026-04-20 21:18:59.03284374 +0000 UTC m=+388.761812339" Apr 20 21:19:01.641651 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.641587 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p"] Apr 20 21:19:01.642141 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.642121 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerName="util" Apr 20 21:19:01.642141 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.642141 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerName="util" Apr 20 21:19:01.642305 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.642151 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerName="pull" Apr 20 21:19:01.642305 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.642159 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerName="pull" Apr 20 21:19:01.642305 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.642172 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerName="extract" Apr 20 21:19:01.642305 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.642178 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerName="extract" Apr 20 21:19:01.642305 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.642263 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a62e35f3-6f45-4493-ae43-3e3919a597c0" containerName="extract" Apr 20 21:19:01.645381 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.645358 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.648612 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.648589 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 21:19:01.648695 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.648632 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 21:19:01.648695 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.648664 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-2bw4s\"" Apr 20 21:19:01.648872 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.648859 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 21:19:01.648912 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.648881 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 21:19:01.670489 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.670466 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p"] Apr 20 21:19:01.799331 ip-10-0-129-57 kubenswrapper[2567]: 
I0420 21:19:01.799289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.799331 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.799326 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk47v\" (UniqueName: \"kubernetes.io/projected/502928a8-1d20-4943-a5c1-077a4b81c99e-kube-api-access-mk47v\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.799570 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.799411 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.799570 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.799425 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.799570 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.799480 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.799570 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.799557 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.799767 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.799598 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/502928a8-1d20-4943-a5c1-077a4b81c99e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.900970 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.900890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.900970 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.900936 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.901179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.900971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.901179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.901035 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.901179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.901077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/502928a8-1d20-4943-a5c1-077a4b81c99e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.901179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.901109 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.901179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.901135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mk47v\" (UniqueName: \"kubernetes.io/projected/502928a8-1d20-4943-a5c1-077a4b81c99e-kube-api-access-mk47v\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.901410 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.901272 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:19:01.902272 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.901697 2567 scope.go:117] "RemoveContainer" containerID="22ee65e924435ac5747a4d160508eef74361ea57ac2e38f663fc7775a6afa551" Apr 20 21:19:01.902272 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.902208 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.904307 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.904159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.904307 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.904255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.904307 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.904265 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/502928a8-1d20-4943-a5c1-077a4b81c99e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.904447 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.904353 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/502928a8-1d20-4943-a5c1-077a4b81c99e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.908694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.908671 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/502928a8-1d20-4943-a5c1-077a4b81c99e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.909109 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.909088 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk47v\" (UniqueName: \"kubernetes.io/projected/502928a8-1d20-4943-a5c1-077a4b81c99e-kube-api-access-mk47v\") pod \"istiod-openshift-gateway-55ff986f96-h9k8p\" (UID: \"502928a8-1d20-4943-a5c1-077a4b81c99e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:01.954369 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:01.954345 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:02.115481 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:02.112728 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p"] Apr 20 21:19:02.115481 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:19:02.115099 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod502928a8_1d20_4943_a5c1_077a4b81c99e.slice/crio-09bec7d87747a664a9a07ee6e43e2a71d1cf48c83e32de67c06ec2b9c53b2912 WatchSource:0}: Error finding container 09bec7d87747a664a9a07ee6e43e2a71d1cf48c83e32de67c06ec2b9c53b2912: Status 404 returned error can't find the container with id 09bec7d87747a664a9a07ee6e43e2a71d1cf48c83e32de67c06ec2b9c53b2912 Apr 20 21:19:03.030373 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:03.030339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" event={"ID":"f82b7fb7-1330-451e-8bb2-65948b52b19c","Type":"ContainerStarted","Data":"9189493073e6e14467f3eb64f71e81b42fe3abb0bd7e1eb40fc53229cdfe19e6"} Apr 20 21:19:03.030866 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:03.030845 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:19:03.031998 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:03.031961 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" event={"ID":"502928a8-1d20-4943-a5c1-077a4b81c99e","Type":"ContainerStarted","Data":"09bec7d87747a664a9a07ee6e43e2a71d1cf48c83e32de67c06ec2b9c53b2912"} Apr 20 21:19:04.724335 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:04.724291 2567 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:19:04.724701 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:04.724358 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:19:05.040394 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:05.040295 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" event={"ID":"502928a8-1d20-4943-a5c1-077a4b81c99e","Type":"ContainerStarted","Data":"cd3b27407177fc5222a7413f08bf38d602e9dc6da66add5cecae9c4d52995cf4"} Apr 20 21:19:05.040588 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:05.040560 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:05.042232 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:05.042208 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-h9k8p container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 21:19:05.042325 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:05.042255 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" podUID="502928a8-1d20-4943-a5c1-077a4b81c99e" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 21:19:05.060622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:05.060578 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" podStartSLOduration=1.455605688 podStartE2EDuration="4.060565847s" podCreationTimestamp="2026-04-20 21:19:01 +0000 UTC" 
firstStartedPulling="2026-04-20 21:19:02.11905867 +0000 UTC m=+391.848027251" lastFinishedPulling="2026-04-20 21:19:04.72401883 +0000 UTC m=+394.452987410" observedRunningTime="2026-04-20 21:19:05.058570114 +0000 UTC m=+394.787538717" watchObservedRunningTime="2026-04-20 21:19:05.060565847 +0000 UTC m=+394.789534445" Apr 20 21:19:06.044089 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:06.044060 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-h9k8p container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 21:19:06.044462 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:06.044123 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" podUID="502928a8-1d20-4943-a5c1-077a4b81c99e" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 21:19:09.044378 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:09.044343 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9k8p" Apr 20 21:19:11.023246 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:11.023214 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pdb8z" Apr 20 21:19:14.037653 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:14.037622 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-kp466" Apr 20 21:19:20.978567 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:20.978534 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-cxmzf" Apr 20 21:19:37.880928 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:37.880891 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/kuadrant-operator-catalog-bp6zs"] Apr 20 21:19:37.887866 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:37.887846 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:37.890120 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:37.890096 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 21:19:37.890344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:37.890320 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 21:19:37.890659 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:37.890639 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bp6zs"] Apr 20 21:19:37.890841 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:37.890824 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-rddfd\"" Apr 20 21:19:37.996198 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:37.996175 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jssng\" (UniqueName: \"kubernetes.io/projected/86e1a7e0-4997-4d3b-aac9-1505b67a27e3-kube-api-access-jssng\") pod \"kuadrant-operator-catalog-bp6zs\" (UID: \"86e1a7e0-4997-4d3b-aac9-1505b67a27e3\") " pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:38.097362 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:38.097332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jssng\" (UniqueName: \"kubernetes.io/projected/86e1a7e0-4997-4d3b-aac9-1505b67a27e3-kube-api-access-jssng\") pod \"kuadrant-operator-catalog-bp6zs\" (UID: \"86e1a7e0-4997-4d3b-aac9-1505b67a27e3\") " pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 
21:19:38.104904 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:38.104880 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jssng\" (UniqueName: \"kubernetes.io/projected/86e1a7e0-4997-4d3b-aac9-1505b67a27e3-kube-api-access-jssng\") pod \"kuadrant-operator-catalog-bp6zs\" (UID: \"86e1a7e0-4997-4d3b-aac9-1505b67a27e3\") " pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:38.198039 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:38.197970 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:38.314579 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:38.314553 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bp6zs"] Apr 20 21:19:38.316413 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:19:38.316375 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e1a7e0_4997_4d3b_aac9_1505b67a27e3.slice/crio-2b2e379e9bb6d40bcf6ff8d7e52359400b42e8c3deaa03854c9b156c744841ac WatchSource:0}: Error finding container 2b2e379e9bb6d40bcf6ff8d7e52359400b42e8c3deaa03854c9b156c744841ac: Status 404 returned error can't find the container with id 2b2e379e9bb6d40bcf6ff8d7e52359400b42e8c3deaa03854c9b156c744841ac Apr 20 21:19:39.161905 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:39.161871 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" event={"ID":"86e1a7e0-4997-4d3b-aac9-1505b67a27e3","Type":"ContainerStarted","Data":"2b2e379e9bb6d40bcf6ff8d7e52359400b42e8c3deaa03854c9b156c744841ac"} Apr 20 21:19:41.169833 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:41.169798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" 
event={"ID":"86e1a7e0-4997-4d3b-aac9-1505b67a27e3","Type":"ContainerStarted","Data":"88574b7be5aafe1c5fffb799e35b1b96a9cbe9fe2aeb957a11cc555efdafee6d"} Apr 20 21:19:41.185099 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:41.185057 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" podStartSLOduration=2.314812871 podStartE2EDuration="4.185042991s" podCreationTimestamp="2026-04-20 21:19:37 +0000 UTC" firstStartedPulling="2026-04-20 21:19:38.317632246 +0000 UTC m=+428.046600823" lastFinishedPulling="2026-04-20 21:19:40.187862362 +0000 UTC m=+429.916830943" observedRunningTime="2026-04-20 21:19:41.182205109 +0000 UTC m=+430.911173710" watchObservedRunningTime="2026-04-20 21:19:41.185042991 +0000 UTC m=+430.914011591" Apr 20 21:19:48.198366 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:48.198327 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:48.198774 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:48.198381 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:48.220036 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:48.220013 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:49.225098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:49.225069 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-bp6zs" Apr 20 21:19:54.916021 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:54.915930 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw"] Apr 20 21:19:54.919343 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:54.919327 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:54.921590 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:54.921572 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tx77r\"" Apr 20 21:19:54.926698 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:54.926625 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw"] Apr 20 21:19:55.029600 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.029572 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjns\" (UniqueName: \"kubernetes.io/projected/837d9f42-ca4b-414b-b8f2-1670d9c6f266-kube-api-access-2pjns\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.029741 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.029606 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.029741 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.029624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.130054 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.130024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjns\" (UniqueName: \"kubernetes.io/projected/837d9f42-ca4b-414b-b8f2-1670d9c6f266-kube-api-access-2pjns\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.130054 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.130057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.130248 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.130079 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.130460 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.130441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.130529 
ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.130468 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.137766 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.137748 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjns\" (UniqueName: \"kubernetes.io/projected/837d9f42-ca4b-414b-b8f2-1670d9c6f266-kube-api-access-2pjns\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.229240 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.229188 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:55.346483 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:55.346457 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw"] Apr 20 21:19:55.348301 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:19:55.348276 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod837d9f42_ca4b_414b_b8f2_1670d9c6f266.slice/crio-fd28dcedceaa4c2a43e78a65879a9cde18e505c233d930527ac48e9185f39e31 WatchSource:0}: Error finding container fd28dcedceaa4c2a43e78a65879a9cde18e505c233d930527ac48e9185f39e31: Status 404 returned error can't find the container with id fd28dcedceaa4c2a43e78a65879a9cde18e505c233d930527ac48e9185f39e31 Apr 20 21:19:56.229100 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:56.229062 2567 generic.go:358] "Generic (PLEG): container finished" podID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerID="8d4552882d6a707429c0b86d927aeb6087f8c92a56aa0fb9de1fcbc56e58302c" exitCode=0 Apr 20 21:19:56.229526 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:56.229133 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" event={"ID":"837d9f42-ca4b-414b-b8f2-1670d9c6f266","Type":"ContainerDied","Data":"8d4552882d6a707429c0b86d927aeb6087f8c92a56aa0fb9de1fcbc56e58302c"} Apr 20 21:19:56.229526 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:56.229158 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" event={"ID":"837d9f42-ca4b-414b-b8f2-1670d9c6f266","Type":"ContainerStarted","Data":"fd28dcedceaa4c2a43e78a65879a9cde18e505c233d930527ac48e9185f39e31"} Apr 20 21:19:57.234238 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:57.234212 2567 
generic.go:358] "Generic (PLEG): container finished" podID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerID="fcf7f017507d61b7633fa784dd245868cad4624dcfe29b842effd9cda34a9df4" exitCode=0 Apr 20 21:19:57.234527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:57.234296 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" event={"ID":"837d9f42-ca4b-414b-b8f2-1670d9c6f266","Type":"ContainerDied","Data":"fcf7f017507d61b7633fa784dd245868cad4624dcfe29b842effd9cda34a9df4"} Apr 20 21:19:58.240033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:58.240003 2567 generic.go:358] "Generic (PLEG): container finished" podID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerID="377ba432293f5222f3714278b24ac2376b2419705dd4092e058f72cc370adff4" exitCode=0 Apr 20 21:19:58.240394 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:58.240090 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" event={"ID":"837d9f42-ca4b-414b-b8f2-1670d9c6f266","Type":"ContainerDied","Data":"377ba432293f5222f3714278b24ac2376b2419705dd4092e058f72cc370adff4"} Apr 20 21:19:59.363253 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.363228 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:19:59.462037 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.462008 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-bundle\") pod \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " Apr 20 21:19:59.462214 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.462083 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pjns\" (UniqueName: \"kubernetes.io/projected/837d9f42-ca4b-414b-b8f2-1670d9c6f266-kube-api-access-2pjns\") pod \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " Apr 20 21:19:59.462214 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.462121 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-util\") pod \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\" (UID: \"837d9f42-ca4b-414b-b8f2-1670d9c6f266\") " Apr 20 21:19:59.462595 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.462566 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-bundle" (OuterVolumeSpecName: "bundle") pod "837d9f42-ca4b-414b-b8f2-1670d9c6f266" (UID: "837d9f42-ca4b-414b-b8f2-1670d9c6f266"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:19:59.464562 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.464535 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837d9f42-ca4b-414b-b8f2-1670d9c6f266-kube-api-access-2pjns" (OuterVolumeSpecName: "kube-api-access-2pjns") pod "837d9f42-ca4b-414b-b8f2-1670d9c6f266" (UID: "837d9f42-ca4b-414b-b8f2-1670d9c6f266"). InnerVolumeSpecName "kube-api-access-2pjns". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:19:59.467806 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.467783 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-util" (OuterVolumeSpecName: "util") pod "837d9f42-ca4b-414b-b8f2-1670d9c6f266" (UID: "837d9f42-ca4b-414b-b8f2-1670d9c6f266"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:19:59.562850 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.562782 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:19:59.562850 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.562807 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pjns\" (UniqueName: \"kubernetes.io/projected/837d9f42-ca4b-414b-b8f2-1670d9c6f266-kube-api-access-2pjns\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:19:59.562850 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:19:59.562817 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/837d9f42-ca4b-414b-b8f2-1670d9c6f266-util\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:20:00.249274 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:00.249243 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" event={"ID":"837d9f42-ca4b-414b-b8f2-1670d9c6f266","Type":"ContainerDied","Data":"fd28dcedceaa4c2a43e78a65879a9cde18e505c233d930527ac48e9185f39e31"} Apr 20 21:20:00.249274 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:00.249276 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd28dcedceaa4c2a43e78a65879a9cde18e505c233d930527ac48e9185f39e31" Apr 20 21:20:00.249274 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:00.249278 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw" Apr 20 21:20:10.137229 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137197 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-859ff44bcd-tlr2d"] Apr 20 21:20:10.137677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137539 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerName="pull" Apr 20 21:20:10.137677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137551 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerName="pull" Apr 20 21:20:10.137677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137561 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerName="extract" Apr 20 21:20:10.137677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137566 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerName="extract" Apr 20 21:20:10.137677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137589 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerName="util" 
Apr 20 21:20:10.137677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137594 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerName="util" Apr 20 21:20:10.137677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.137656 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="837d9f42-ca4b-414b-b8f2-1670d9c6f266" containerName="extract" Apr 20 21:20:10.143085 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.143067 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.151517 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.151497 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859ff44bcd-tlr2d"] Apr 20 21:20:10.245013 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.244962 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f721edfa-67f3-4f75-b7da-038490a83e97-console-oauth-config\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.245167 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.245043 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-trusted-ca-bundle\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.245167 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.245068 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f721edfa-67f3-4f75-b7da-038490a83e97-console-serving-cert\") pod 
\"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.245167 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.245089 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-service-ca\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.245167 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.245106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-oauth-serving-cert\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.245167 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.245145 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-975rh\" (UniqueName: \"kubernetes.io/projected/f721edfa-67f3-4f75-b7da-038490a83e97-kube-api-access-975rh\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.245350 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.245242 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-console-config\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.345821 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.345789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f721edfa-67f3-4f75-b7da-038490a83e97-console-serving-cert\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.345975 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.345832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-service-ca\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.345975 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.345860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-oauth-serving-cert\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.345975 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.345886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-975rh\" (UniqueName: \"kubernetes.io/projected/f721edfa-67f3-4f75-b7da-038490a83e97-kube-api-access-975rh\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.345975 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.345962 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-console-config\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.346195 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.346012 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f721edfa-67f3-4f75-b7da-038490a83e97-console-oauth-config\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.346195 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.346057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-trusted-ca-bundle\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.346634 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.346605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-service-ca\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.346933 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.346635 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-oauth-serving-cert\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.346933 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.346803 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-trusted-ca-bundle\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d" Apr 20 21:20:10.346933 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:20:10.346885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f721edfa-67f3-4f75-b7da-038490a83e97-console-config\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:10.348507 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.348485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f721edfa-67f3-4f75-b7da-038490a83e97-console-serving-cert\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:10.348709 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.348688 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f721edfa-67f3-4f75-b7da-038490a83e97-console-oauth-config\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:10.353842 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.353826 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-975rh\" (UniqueName: \"kubernetes.io/projected/f721edfa-67f3-4f75-b7da-038490a83e97-kube-api-access-975rh\") pod \"console-859ff44bcd-tlr2d\" (UID: \"f721edfa-67f3-4f75-b7da-038490a83e97\") " pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:10.452550 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.452488 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:10.571893 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:10.571869 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859ff44bcd-tlr2d"]
Apr 20 21:20:10.573754 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:20:10.573724 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf721edfa_67f3_4f75_b7da_038490a83e97.slice/crio-05bd4d76e417bce33fe7b36673c17362faa4e4fcdbc6252102bf7fe3304ba0e9 WatchSource:0}: Error finding container 05bd4d76e417bce33fe7b36673c17362faa4e4fcdbc6252102bf7fe3304ba0e9: Status 404 returned error can't find the container with id 05bd4d76e417bce33fe7b36673c17362faa4e4fcdbc6252102bf7fe3304ba0e9
Apr 20 21:20:11.182950 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.182915 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-tvwzd"]
Apr 20 21:20:11.185160 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.185144 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-tvwzd"
Apr 20 21:20:11.188414 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.188392 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-4kj9d\""
Apr 20 21:20:11.209257 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.209234 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-tvwzd"]
Apr 20 21:20:11.291236 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.291202 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859ff44bcd-tlr2d" event={"ID":"f721edfa-67f3-4f75-b7da-038490a83e97","Type":"ContainerStarted","Data":"da5199c09220f300c0466ee2a7cdac67eed49aaf70ab43598d306af7aa44a339"}
Apr 20 21:20:11.291372 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.291244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859ff44bcd-tlr2d" event={"ID":"f721edfa-67f3-4f75-b7da-038490a83e97","Type":"ContainerStarted","Data":"05bd4d76e417bce33fe7b36673c17362faa4e4fcdbc6252102bf7fe3304ba0e9"}
Apr 20 21:20:11.313350 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.313310 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-859ff44bcd-tlr2d" podStartSLOduration=1.313298778 podStartE2EDuration="1.313298778s" podCreationTimestamp="2026-04-20 21:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:20:11.31137269 +0000 UTC m=+461.040341289" watchObservedRunningTime="2026-04-20 21:20:11.313298778 +0000 UTC m=+461.042267376"
Apr 20 21:20:11.352844 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.352817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6q6\" (UniqueName: \"kubernetes.io/projected/2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd-kube-api-access-nz6q6\") pod \"authorino-operator-657f44b778-tvwzd\" (UID: \"2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd\") " pod="kuadrant-system/authorino-operator-657f44b778-tvwzd"
Apr 20 21:20:11.453210 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.453152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6q6\" (UniqueName: \"kubernetes.io/projected/2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd-kube-api-access-nz6q6\") pod \"authorino-operator-657f44b778-tvwzd\" (UID: \"2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd\") " pod="kuadrant-system/authorino-operator-657f44b778-tvwzd"
Apr 20 21:20:11.467430 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.467397 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6q6\" (UniqueName: \"kubernetes.io/projected/2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd-kube-api-access-nz6q6\") pod \"authorino-operator-657f44b778-tvwzd\" (UID: \"2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd\") " pod="kuadrant-system/authorino-operator-657f44b778-tvwzd"
Apr 20 21:20:11.495152 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.495131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-tvwzd"
Apr 20 21:20:11.620399 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:11.620375 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-tvwzd"]
Apr 20 21:20:11.622415 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:20:11.622385 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d5adbef_9c5f_40fe_903a_5b7fc3f0c1bd.slice/crio-516f67a60bffcda10a8079c42db255b7c0ba078a670e267e7e54cda5ccf17929 WatchSource:0}: Error finding container 516f67a60bffcda10a8079c42db255b7c0ba078a670e267e7e54cda5ccf17929: Status 404 returned error can't find the container with id 516f67a60bffcda10a8079c42db255b7c0ba078a670e267e7e54cda5ccf17929
Apr 20 21:20:12.297503 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:12.297455 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-tvwzd" event={"ID":"2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd","Type":"ContainerStarted","Data":"516f67a60bffcda10a8079c42db255b7c0ba078a670e267e7e54cda5ccf17929"}
Apr 20 21:20:13.301893 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:13.301861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-tvwzd" event={"ID":"2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd","Type":"ContainerStarted","Data":"a70ccb0d6aa06932dc84743a2385b885b7e0db42e507dba24512df5aeaf01a23"}
Apr 20 21:20:13.302250 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:13.302027 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-tvwzd"
Apr 20 21:20:13.320186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:13.320067 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-tvwzd" podStartSLOduration=0.829574943 podStartE2EDuration="2.320052094s" podCreationTimestamp="2026-04-20 21:20:11 +0000 UTC" firstStartedPulling="2026-04-20 21:20:11.624391719 +0000 UTC m=+461.353360297" lastFinishedPulling="2026-04-20 21:20:13.11486887 +0000 UTC m=+462.843837448" observedRunningTime="2026-04-20 21:20:13.319909608 +0000 UTC m=+463.048878208" watchObservedRunningTime="2026-04-20 21:20:13.320052094 +0000 UTC m=+463.049020694"
Apr 20 21:20:20.452616 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:20.452574 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:20.452616 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:20.452615 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:20.457117 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:20.457092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:21.334217 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:21.334188 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-859ff44bcd-tlr2d"
Apr 20 21:20:21.460527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:21.460492 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cc87fbcf9-826x6"]
Apr 20 21:20:24.307870 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:24.307838 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-tvwzd"
Apr 20 21:20:45.622733 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.622696 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"]
Apr 20 21:20:45.625370 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.625352 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:45.628423 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.628404 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-wnf2g\""
Apr 20 21:20:45.637161 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.637139 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"]
Apr 20 21:20:45.744453 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.744423 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270b3af9-da05-4943-9846-e119f2b26ea9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:45.744453 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.744455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kv2v\" (UniqueName: \"kubernetes.io/projected/270b3af9-da05-4943-9846-e119f2b26ea9-kube-api-access-8kv2v\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:45.845311 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.845281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270b3af9-da05-4943-9846-e119f2b26ea9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:45.845311 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.845310 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kv2v\" (UniqueName: \"kubernetes.io/projected/270b3af9-da05-4943-9846-e119f2b26ea9-kube-api-access-8kv2v\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:45.845651 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.845633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270b3af9-da05-4943-9846-e119f2b26ea9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:45.866449 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.866423 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kv2v\" (UniqueName: \"kubernetes.io/projected/270b3af9-da05-4943-9846-e119f2b26ea9-kube-api-access-8kv2v\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:45.935050 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:45.934947 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:46.065104 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.065076 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"]
Apr 20 21:20:46.066683 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:20:46.066659 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod270b3af9_da05_4943_9846_e119f2b26ea9.slice/crio-cded1c9118fd4fd82d55db112d05191073b8214c70f2c132b6748100d7ff4d2d WatchSource:0}: Error finding container cded1c9118fd4fd82d55db112d05191073b8214c70f2c132b6748100d7ff4d2d: Status 404 returned error can't find the container with id cded1c9118fd4fd82d55db112d05191073b8214c70f2c132b6748100d7ff4d2d
Apr 20 21:20:46.343337 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.343301 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"]
Apr 20 21:20:46.349949 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.349881 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"]
Apr 20 21:20:46.370696 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.370669 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"]
Apr 20 21:20:46.373522 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.373498 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.396874 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.396845 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"]
Apr 20 21:20:46.450923 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.450889 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.451105 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.451034 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599gn\" (UniqueName: \"kubernetes.io/projected/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-kube-api-access-599gn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.480172 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.480139 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5cc87fbcf9-826x6" podUID="261cec5b-ea8b-4b66-b5e7-2f33ae892080" containerName="console" containerID="cri-o://09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7" gracePeriod=15
Apr 20 21:20:46.552187 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.552149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-599gn\" (UniqueName: \"kubernetes.io/projected/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-kube-api-access-599gn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.552377 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.552236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.552666 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.552644 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.562908 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.562877 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-599gn\" (UniqueName: \"kubernetes.io/projected/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-kube-api-access-599gn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.686193 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.686125 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:46.737721 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.737693 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cc87fbcf9-826x6_261cec5b-ea8b-4b66-b5e7-2f33ae892080/console/0.log"
Apr 20 21:20:46.737935 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.737768 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cc87fbcf9-826x6"
Apr 20 21:20:46.860172 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860091 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-oauth-serving-cert\") pod \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") "
Apr 20 21:20:46.860172 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860138 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-serving-cert\") pod \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") "
Apr 20 21:20:46.860363 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860256 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-config\") pod \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") "
Apr 20 21:20:46.860363 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860291 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-service-ca\") pod \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") "
Apr 20 21:20:46.860363 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860329 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-oauth-config\") pod \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") "
Apr 20 21:20:46.860527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860373 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gghqp\" (UniqueName: \"kubernetes.io/projected/261cec5b-ea8b-4b66-b5e7-2f33ae892080-kube-api-access-gghqp\") pod \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") "
Apr 20 21:20:46.860527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860418 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-trusted-ca-bundle\") pod \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\" (UID: \"261cec5b-ea8b-4b66-b5e7-2f33ae892080\") "
Apr 20 21:20:46.860672 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860640 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "261cec5b-ea8b-4b66-b5e7-2f33ae892080" (UID: "261cec5b-ea8b-4b66-b5e7-2f33ae892080"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:20:46.861349 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.860775 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-oauth-serving-cert\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:46.861349 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.861279 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-config" (OuterVolumeSpecName: "console-config") pod "261cec5b-ea8b-4b66-b5e7-2f33ae892080" (UID: "261cec5b-ea8b-4b66-b5e7-2f33ae892080"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:20:46.861349 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.861287 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "261cec5b-ea8b-4b66-b5e7-2f33ae892080" (UID: "261cec5b-ea8b-4b66-b5e7-2f33ae892080"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:20:46.861349 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.861301 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-service-ca" (OuterVolumeSpecName: "service-ca") pod "261cec5b-ea8b-4b66-b5e7-2f33ae892080" (UID: "261cec5b-ea8b-4b66-b5e7-2f33ae892080"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:20:46.862912 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.862887 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "261cec5b-ea8b-4b66-b5e7-2f33ae892080" (UID: "261cec5b-ea8b-4b66-b5e7-2f33ae892080"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:20:46.863148 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.862969 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "261cec5b-ea8b-4b66-b5e7-2f33ae892080" (UID: "261cec5b-ea8b-4b66-b5e7-2f33ae892080"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:20:46.863227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.863203 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261cec5b-ea8b-4b66-b5e7-2f33ae892080-kube-api-access-gghqp" (OuterVolumeSpecName: "kube-api-access-gghqp") pod "261cec5b-ea8b-4b66-b5e7-2f33ae892080" (UID: "261cec5b-ea8b-4b66-b5e7-2f33ae892080"). InnerVolumeSpecName "kube-api-access-gghqp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:20:46.872383 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.872308 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"]
Apr 20 21:20:46.875184 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:20:46.875154 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode957fb0e_4f83_4766_b4e3_3f0328d2cd12.slice/crio-91f54bd74b00a64ad2e90e9a1c99c3fe8d3cc5515922ac9db0c702d392bb23bc WatchSource:0}: Error finding container 91f54bd74b00a64ad2e90e9a1c99c3fe8d3cc5515922ac9db0c702d392bb23bc: Status 404 returned error can't find the container with id 91f54bd74b00a64ad2e90e9a1c99c3fe8d3cc5515922ac9db0c702d392bb23bc
Apr 20 21:20:46.961393 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.961362 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-serving-cert\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:46.961393 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.961394 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-config\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:46.961666 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.961408 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-service-ca\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:46.961666 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.961417 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/261cec5b-ea8b-4b66-b5e7-2f33ae892080-console-oauth-config\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:46.961666 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.961426 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gghqp\" (UniqueName: \"kubernetes.io/projected/261cec5b-ea8b-4b66-b5e7-2f33ae892080-kube-api-access-gghqp\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:46.961666 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:46.961434 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/261cec5b-ea8b-4b66-b5e7-2f33ae892080-trusted-ca-bundle\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:47.431744 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.431713 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cc87fbcf9-826x6_261cec5b-ea8b-4b66-b5e7-2f33ae892080/console/0.log"
Apr 20 21:20:47.431924 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.431766 2567 generic.go:358] "Generic (PLEG): container finished" podID="261cec5b-ea8b-4b66-b5e7-2f33ae892080" containerID="09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7" exitCode=2
Apr 20 21:20:47.432002 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.431974 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cc87fbcf9-826x6"
Apr 20 21:20:47.432128 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.432053 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc87fbcf9-826x6" event={"ID":"261cec5b-ea8b-4b66-b5e7-2f33ae892080","Type":"ContainerDied","Data":"09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7"}
Apr 20 21:20:47.432128 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.432094 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cc87fbcf9-826x6" event={"ID":"261cec5b-ea8b-4b66-b5e7-2f33ae892080","Type":"ContainerDied","Data":"a14049352de773550d3c1e9616598171ee6344b67a35b01155c25e0632209cac"}
Apr 20 21:20:47.432128 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.432109 2567 scope.go:117] "RemoveContainer" containerID="09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7"
Apr 20 21:20:47.435675 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.435624 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" event={"ID":"e957fb0e-4f83-4766-b4e3-3f0328d2cd12","Type":"ContainerStarted","Data":"91f54bd74b00a64ad2e90e9a1c99c3fe8d3cc5515922ac9db0c702d392bb23bc"}
Apr 20 21:20:47.443753 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.443734 2567 scope.go:117] "RemoveContainer" containerID="09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7"
Apr 20 21:20:47.444074 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:20:47.444041 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7\": container with ID starting with 09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7 not found: ID does not exist" containerID="09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7"
Apr 20 21:20:47.444179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.444078 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7"} err="failed to get container status \"09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7\": rpc error: code = NotFound desc = could not find container \"09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7\": container with ID starting with 09d56514d8ba320c6afccd92c0e650842258b5fea59d7572b48da38894ec4cf7 not found: ID does not exist"
Apr 20 21:20:47.469014 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.466328 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cc87fbcf9-826x6"]
Apr 20 21:20:47.476221 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:47.476191 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5cc87fbcf9-826x6"]
Apr 20 21:20:48.809824 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:48.809705 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261cec5b-ea8b-4b66-b5e7-2f33ae892080" path="/var/lib/kubelet/pods/261cec5b-ea8b-4b66-b5e7-2f33ae892080/volumes"
Apr 20 21:20:50.450490 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.450414 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" containerName="manager" containerID="cri-o://2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa" gracePeriod=2
Apr 20 21:20:50.451887 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.451862 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" event={"ID":"e957fb0e-4f83-4766-b4e3-3f0328d2cd12","Type":"ContainerStarted","Data":"50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a"}
Apr 20 21:20:50.452028 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.452016 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"
Apr 20 21:20:50.476885 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.476828 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" podStartSLOduration=1.483255762 podStartE2EDuration="4.476808904s" podCreationTimestamp="2026-04-20 21:20:46 +0000 UTC" firstStartedPulling="2026-04-20 21:20:46.878484004 +0000 UTC m=+496.607452581" lastFinishedPulling="2026-04-20 21:20:49.872037145 +0000 UTC m=+499.601005723" observedRunningTime="2026-04-20 21:20:50.473638151 +0000 UTC m=+500.202606750" watchObservedRunningTime="2026-04-20 21:20:50.476808904 +0000 UTC m=+500.205777504"
Apr 20 21:20:50.693247 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.693219 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:50.695091 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.695063 2567 status_manager.go:895] "Failed to get status for pod" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" is forbidden: User \"system:node:ip-10-0-129-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-57.ec2.internal' and this object"
Apr 20 21:20:50.798505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.798392 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kv2v\" (UniqueName: \"kubernetes.io/projected/270b3af9-da05-4943-9846-e119f2b26ea9-kube-api-access-8kv2v\") pod \"270b3af9-da05-4943-9846-e119f2b26ea9\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") "
Apr 20 21:20:50.798505 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.798479 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270b3af9-da05-4943-9846-e119f2b26ea9-extensions-socket-volume\") pod \"270b3af9-da05-4943-9846-e119f2b26ea9\" (UID: \"270b3af9-da05-4943-9846-e119f2b26ea9\") "
Apr 20 21:20:50.798731 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.798652 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270b3af9-da05-4943-9846-e119f2b26ea9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "270b3af9-da05-4943-9846-e119f2b26ea9" (UID: "270b3af9-da05-4943-9846-e119f2b26ea9"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:20:50.798846 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.798827 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270b3af9-da05-4943-9846-e119f2b26ea9-extensions-socket-volume\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:50.800751 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.800724 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270b3af9-da05-4943-9846-e119f2b26ea9-kube-api-access-8kv2v" (OuterVolumeSpecName: "kube-api-access-8kv2v") pod "270b3af9-da05-4943-9846-e119f2b26ea9" (UID: "270b3af9-da05-4943-9846-e119f2b26ea9"). InnerVolumeSpecName "kube-api-access-8kv2v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:20:50.807086 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.807056 2567 status_manager.go:895] "Failed to get status for pod" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" is forbidden: User \"system:node:ip-10-0-129-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-57.ec2.internal' and this object"
Apr 20 21:20:50.808765 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.808739 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" path="/var/lib/kubelet/pods/270b3af9-da05-4943-9846-e119f2b26ea9/volumes"
Apr 20 21:20:50.900105 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:50.900038 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kv2v\" (UniqueName: \"kubernetes.io/projected/270b3af9-da05-4943-9846-e119f2b26ea9-kube-api-access-8kv2v\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:20:51.456857 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:51.456823 2567 generic.go:358] "Generic (PLEG): container finished" podID="270b3af9-da05-4943-9846-e119f2b26ea9" containerID="2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa" exitCode=2
Apr 20 21:20:51.457462 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:51.456878 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn"
Apr 20 21:20:51.457462 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:51.456918 2567 scope.go:117] "RemoveContainer" containerID="2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa"
Apr 20 21:20:51.461226 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:51.461198 2567 status_manager.go:895] "Failed to get status for pod" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lh9wn\" is forbidden: User \"system:node:ip-10-0-129-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-57.ec2.internal' and this object"
Apr 20 21:20:51.465492 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:51.465475 2567 scope.go:117] "RemoveContainer" containerID="2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa"
Apr 20 21:20:51.465741 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:20:51.465722 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa\": container with ID starting with 2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa not found: ID does not exist" containerID="2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa"
Apr 20 21:20:51.465800 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:20:51.465753 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa"} err="failed to get container status \"2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa\": rpc error: code = NotFound desc = could not find container \"2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa\": container with ID starting with 2427585ac6979fb046622d55a022992d6eb9a05b5c24ee9f9db37a4b6ecbf1fa not found: ID does not exist" Apr 20 21:21:01.459740 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:01.459711 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" Apr 20 21:21:13.779864 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:13.779823 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"] Apr 20 21:21:13.780311 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:13.780088 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" podUID="e957fb0e-4f83-4766-b4e3-3f0328d2cd12" containerName="manager" containerID="cri-o://50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a" gracePeriod=10 Apr 20 21:21:14.020064 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.020036 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" Apr 20 21:21:14.103514 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.103488 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-extensions-socket-volume\") pod \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " Apr 20 21:21:14.103687 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.103547 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-599gn\" (UniqueName: \"kubernetes.io/projected/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-kube-api-access-599gn\") pod \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\" (UID: \"e957fb0e-4f83-4766-b4e3-3f0328d2cd12\") " Apr 20 21:21:14.104037 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.104010 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e957fb0e-4f83-4766-b4e3-3f0328d2cd12" (UID: "e957fb0e-4f83-4766-b4e3-3f0328d2cd12"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:21:14.105710 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.105687 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-kube-api-access-599gn" (OuterVolumeSpecName: "kube-api-access-599gn") pod "e957fb0e-4f83-4766-b4e3-3f0328d2cd12" (UID: "e957fb0e-4f83-4766-b4e3-3f0328d2cd12"). InnerVolumeSpecName "kube-api-access-599gn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:21:14.204684 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.204639 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-extensions-socket-volume\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:21:14.204684 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.204675 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-599gn\" (UniqueName: \"kubernetes.io/projected/e957fb0e-4f83-4766-b4e3-3f0328d2cd12-kube-api-access-599gn\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:21:14.544865 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.544780 2567 generic.go:358] "Generic (PLEG): container finished" podID="e957fb0e-4f83-4766-b4e3-3f0328d2cd12" containerID="50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a" exitCode=0 Apr 20 21:21:14.544865 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.544819 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" event={"ID":"e957fb0e-4f83-4766-b4e3-3f0328d2cd12","Type":"ContainerDied","Data":"50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a"} Apr 20 21:21:14.544865 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.544843 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" event={"ID":"e957fb0e-4f83-4766-b4e3-3f0328d2cd12","Type":"ContainerDied","Data":"91f54bd74b00a64ad2e90e9a1c99c3fe8d3cc5515922ac9db0c702d392bb23bc"} Apr 20 21:21:14.544865 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.544844 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x" Apr 20 21:21:14.544865 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.544858 2567 scope.go:117] "RemoveContainer" containerID="50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a" Apr 20 21:21:14.558836 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.558813 2567 scope.go:117] "RemoveContainer" containerID="50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a" Apr 20 21:21:14.559127 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:21:14.559106 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a\": container with ID starting with 50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a not found: ID does not exist" containerID="50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a" Apr 20 21:21:14.559199 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.559138 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a"} err="failed to get container status \"50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a\": rpc error: code = NotFound desc = could not find container \"50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a\": container with ID starting with 50f9ae01fbe8ee76988f79bddc414abb3fa1f7e7bcb9db10dcaddd3a765a4c5a not found: ID does not exist" Apr 20 21:21:14.574324 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.574293 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"] Apr 20 21:21:14.580478 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.580456 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qvz2x"] Apr 20 21:21:14.807804 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:14.807713 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e957fb0e-4f83-4766-b4e3-3f0328d2cd12" path="/var/lib/kubelet/pods/e957fb0e-4f83-4766-b4e3-3f0328d2cd12/volumes" Apr 20 21:21:30.689509 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.689468 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-74229"] Apr 20 21:21:30.690058 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.689944 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="261cec5b-ea8b-4b66-b5e7-2f33ae892080" containerName="console" Apr 20 21:21:30.690058 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.689961 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="261cec5b-ea8b-4b66-b5e7-2f33ae892080" containerName="console" Apr 20 21:21:30.690058 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.689970 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" containerName="manager" Apr 20 21:21:30.690058 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.689978 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" containerName="manager" Apr 20 21:21:30.690058 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.690025 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e957fb0e-4f83-4766-b4e3-3f0328d2cd12" containerName="manager" Apr 20 21:21:30.690058 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.690032 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e957fb0e-4f83-4766-b4e3-3f0328d2cd12" containerName="manager" Apr 20 21:21:30.690463 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.690101 2567 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="261cec5b-ea8b-4b66-b5e7-2f33ae892080" containerName="console" Apr 20 21:21:30.690463 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.690110 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e957fb0e-4f83-4766-b4e3-3f0328d2cd12" containerName="manager" Apr 20 21:21:30.690463 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.690116 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="270b3af9-da05-4943-9846-e119f2b26ea9" containerName="manager" Apr 20 21:21:30.693157 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.693141 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:30.695371 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.695335 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 21:21:30.695371 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.695363 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tx77r\"" Apr 20 21:21:30.699550 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.699521 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-74229"] Apr 20 21:21:30.736466 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.736434 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-config-file\") pod \"limitador-limitador-7d549b5b-74229\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:30.736632 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.736488 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4vkl\" (UniqueName: 
\"kubernetes.io/projected/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-kube-api-access-p4vkl\") pod \"limitador-limitador-7d549b5b-74229\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:30.785419 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.785385 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-74229"] Apr 20 21:21:30.837346 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.837314 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-config-file\") pod \"limitador-limitador-7d549b5b-74229\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:30.837497 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.837354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4vkl\" (UniqueName: \"kubernetes.io/projected/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-kube-api-access-p4vkl\") pod \"limitador-limitador-7d549b5b-74229\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:30.837883 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.837861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-config-file\") pod \"limitador-limitador-7d549b5b-74229\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:30.845869 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:30.845841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4vkl\" (UniqueName: \"kubernetes.io/projected/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-kube-api-access-p4vkl\") pod 
\"limitador-limitador-7d549b5b-74229\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:31.000599 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.000516 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:21:31.004678 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.004651 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:31.005473 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.005457 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.014466 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.014436 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:21:31.039746 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.039709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qk8\" (UniqueName: \"kubernetes.io/projected/bff4826b-070b-4dbc-898a-13349b2c56f9-kube-api-access-m6qk8\") pod \"limitador-limitador-78c99df468-q6v89\" (UID: \"bff4826b-070b-4dbc-898a-13349b2c56f9\") " pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.039895 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.039767 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bff4826b-070b-4dbc-898a-13349b2c56f9-config-file\") pod \"limitador-limitador-78c99df468-q6v89\" (UID: \"bff4826b-070b-4dbc-898a-13349b2c56f9\") " pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.041653 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.041631 2567 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:21:31.138742 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.138709 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-74229"] Apr 20 21:21:31.140823 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.140783 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qk8\" (UniqueName: \"kubernetes.io/projected/bff4826b-070b-4dbc-898a-13349b2c56f9-kube-api-access-m6qk8\") pod \"limitador-limitador-78c99df468-q6v89\" (UID: \"bff4826b-070b-4dbc-898a-13349b2c56f9\") " pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.140958 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.140828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bff4826b-070b-4dbc-898a-13349b2c56f9-config-file\") pod \"limitador-limitador-78c99df468-q6v89\" (UID: \"bff4826b-070b-4dbc-898a-13349b2c56f9\") " pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.141453 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.141434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bff4826b-070b-4dbc-898a-13349b2c56f9-config-file\") pod \"limitador-limitador-78c99df468-q6v89\" (UID: \"bff4826b-070b-4dbc-898a-13349b2c56f9\") " pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.142169 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:21:31.142145 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234d0e1b_b87f_4511_a1b3_ce3baf0ee9f0.slice/crio-f3a64a56f17a74672378fbc35b7ef9888bac79198f7ff3c91cbbf22b9138d34b WatchSource:0}: Error finding container f3a64a56f17a74672378fbc35b7ef9888bac79198f7ff3c91cbbf22b9138d34b: Status 404 
returned error can't find the container with id f3a64a56f17a74672378fbc35b7ef9888bac79198f7ff3c91cbbf22b9138d34b Apr 20 21:21:31.149584 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.149560 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qk8\" (UniqueName: \"kubernetes.io/projected/bff4826b-070b-4dbc-898a-13349b2c56f9-kube-api-access-m6qk8\") pod \"limitador-limitador-78c99df468-q6v89\" (UID: \"bff4826b-070b-4dbc-898a-13349b2c56f9\") " pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.334573 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.334488 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:31.455857 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.455835 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:21:31.457812 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:21:31.457785 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff4826b_070b_4dbc_898a_13349b2c56f9.slice/crio-6094064227d243af3d11d3087d450433d31c9e3d613f52dcedf835dde0cefc12 WatchSource:0}: Error finding container 6094064227d243af3d11d3087d450433d31c9e3d613f52dcedf835dde0cefc12: Status 404 returned error can't find the container with id 6094064227d243af3d11d3087d450433d31c9e3d613f52dcedf835dde0cefc12 Apr 20 21:21:31.611564 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.611528 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" event={"ID":"bff4826b-070b-4dbc-898a-13349b2c56f9","Type":"ContainerStarted","Data":"6094064227d243af3d11d3087d450433d31c9e3d613f52dcedf835dde0cefc12"} Apr 20 21:21:31.612489 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:31.612462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-limitador-7d549b5b-74229" event={"ID":"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0","Type":"ContainerStarted","Data":"f3a64a56f17a74672378fbc35b7ef9888bac79198f7ff3c91cbbf22b9138d34b"} Apr 20 21:21:34.626322 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:34.626283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" event={"ID":"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0","Type":"ContainerStarted","Data":"b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1"} Apr 20 21:21:34.626706 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:34.626331 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:34.627575 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:34.627554 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" event={"ID":"bff4826b-070b-4dbc-898a-13349b2c56f9","Type":"ContainerStarted","Data":"eff73a95d27bde8e124e98a0db3cce4a4f407de4591178efd637fd3fbb79ea7c"} Apr 20 21:21:34.627703 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:34.627692 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:34.641305 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:34.641232 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" podStartSLOduration=1.496815976 podStartE2EDuration="4.641217072s" podCreationTimestamp="2026-04-20 21:21:30 +0000 UTC" firstStartedPulling="2026-04-20 21:21:31.144426324 +0000 UTC m=+540.873394901" lastFinishedPulling="2026-04-20 21:21:34.288827409 +0000 UTC m=+544.017795997" observedRunningTime="2026-04-20 21:21:34.64028126 +0000 UTC m=+544.369249859" watchObservedRunningTime="2026-04-20 21:21:34.641217072 +0000 UTC m=+544.370185677" Apr 20 
21:21:34.657879 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:34.657842 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" podStartSLOduration=1.817738822 podStartE2EDuration="4.657831957s" podCreationTimestamp="2026-04-20 21:21:30 +0000 UTC" firstStartedPulling="2026-04-20 21:21:31.45994693 +0000 UTC m=+541.188915507" lastFinishedPulling="2026-04-20 21:21:34.300040064 +0000 UTC m=+544.029008642" observedRunningTime="2026-04-20 21:21:34.656221963 +0000 UTC m=+544.385190559" watchObservedRunningTime="2026-04-20 21:21:34.657831957 +0000 UTC m=+544.386800555" Apr 20 21:21:45.631931 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:45.631905 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-q6v89" Apr 20 21:21:45.632440 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:45.631951 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:45.685245 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:45.685214 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-74229"] Apr 20 21:21:45.685437 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:45.685403 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" podUID="234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" containerName="limitador" containerID="cri-o://b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1" gracePeriod=30 Apr 20 21:21:46.222029 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.222006 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" Apr 20 21:21:46.371998 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.371958 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-config-file\") pod \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " Apr 20 21:21:46.372154 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.372007 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4vkl\" (UniqueName: \"kubernetes.io/projected/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-kube-api-access-p4vkl\") pod \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\" (UID: \"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0\") " Apr 20 21:21:46.372344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.372318 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-config-file" (OuterVolumeSpecName: "config-file") pod "234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" (UID: "234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:21:46.374180 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.374146 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-kube-api-access-p4vkl" (OuterVolumeSpecName: "kube-api-access-p4vkl") pod "234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" (UID: "234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0"). InnerVolumeSpecName "kube-api-access-p4vkl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:21:46.472815 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.472791 2567 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-config-file\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:21:46.472815 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.472813 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4vkl\" (UniqueName: \"kubernetes.io/projected/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0-kube-api-access-p4vkl\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:21:46.673763 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.673684 2567 generic.go:358] "Generic (PLEG): container finished" podID="234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" containerID="b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1" exitCode=0 Apr 20 21:21:46.673763 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.673751 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-74229"
Apr 20 21:21:46.674186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.673767 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" event={"ID":"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0","Type":"ContainerDied","Data":"b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1"}
Apr 20 21:21:46.674186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.673800 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-74229" event={"ID":"234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0","Type":"ContainerDied","Data":"f3a64a56f17a74672378fbc35b7ef9888bac79198f7ff3c91cbbf22b9138d34b"}
Apr 20 21:21:46.674186 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.673815 2567 scope.go:117] "RemoveContainer" containerID="b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1"
Apr 20 21:21:46.682458 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.682441 2567 scope.go:117] "RemoveContainer" containerID="b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1"
Apr 20 21:21:46.682710 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:21:46.682688 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1\": container with ID starting with b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1 not found: ID does not exist" containerID="b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1"
Apr 20 21:21:46.682812 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.682716 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1"} err="failed to get container status \"b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1\": rpc error: code = NotFound desc = could not find container \"b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1\": container with ID starting with b894d365479548fb650ebcb1e430e26583cf6328d83e7bacea57b5b914be19a1 not found: ID does not exist"
Apr 20 21:21:46.694576 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.694552 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-74229"]
Apr 20 21:21:46.697645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.697624 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-74229"]
Apr 20 21:21:46.807507 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:21:46.807479 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" path="/var/lib/kubelet/pods/234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0/volumes"
Apr 20 21:22:03.193140 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.193100 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q568r"]
Apr 20 21:22:03.193539 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.193522 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" containerName="limitador"
Apr 20 21:22:03.193585 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.193541 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" containerName="limitador"
Apr 20 21:22:03.193623 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.193606 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="234d0e1b-b87f-4511-a1b3-ce3baf0ee9f0" containerName="limitador"
Apr 20 21:22:03.199461 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.199439 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:03.202527 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.202508 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-prkzx\""
Apr 20 21:22:03.208936 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.208912 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q568r"]
Apr 20 21:22:03.315742 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.315701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv79f\" (UniqueName: \"kubernetes.io/projected/8d8f3ac4-b625-4ce3-bac7-b976d60d110a-kube-api-access-xv79f\") pod \"maas-controller-6d4c8f55f9-q568r\" (UID: \"8d8f3ac4-b625-4ce3-bac7-b976d60d110a\") " pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:03.417175 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.417133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv79f\" (UniqueName: \"kubernetes.io/projected/8d8f3ac4-b625-4ce3-bac7-b976d60d110a-kube-api-access-xv79f\") pod \"maas-controller-6d4c8f55f9-q568r\" (UID: \"8d8f3ac4-b625-4ce3-bac7-b976d60d110a\") " pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:03.425873 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.425840 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv79f\" (UniqueName: \"kubernetes.io/projected/8d8f3ac4-b625-4ce3-bac7-b976d60d110a-kube-api-access-xv79f\") pod \"maas-controller-6d4c8f55f9-q568r\" (UID: \"8d8f3ac4-b625-4ce3-bac7-b976d60d110a\") " pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:03.495321 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.495235 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q568r"]
Apr 20 21:22:03.495519 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.495506 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:03.624512 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.624244 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q568r"]
Apr 20 21:22:03.629752 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:22:03.629722 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8f3ac4_b625_4ce3_bac7_b976d60d110a.slice/crio-a709115b2cea34dc3ca6b8ab97482f51efe07f4ff064047567269802309e8b01 WatchSource:0}: Error finding container a709115b2cea34dc3ca6b8ab97482f51efe07f4ff064047567269802309e8b01: Status 404 returned error can't find the container with id a709115b2cea34dc3ca6b8ab97482f51efe07f4ff064047567269802309e8b01
Apr 20 21:22:03.735517 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:03.735479 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q568r" event={"ID":"8d8f3ac4-b625-4ce3-bac7-b976d60d110a","Type":"ContainerStarted","Data":"a709115b2cea34dc3ca6b8ab97482f51efe07f4ff064047567269802309e8b01"}
Apr 20 21:22:06.756840 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:06.756804 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q568r" event={"ID":"8d8f3ac4-b625-4ce3-bac7-b976d60d110a","Type":"ContainerStarted","Data":"abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670"}
Apr 20 21:22:06.757367 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:06.756854 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-q568r" podUID="8d8f3ac4-b625-4ce3-bac7-b976d60d110a" containerName="manager" containerID="cri-o://abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670" gracePeriod=10
Apr 20 21:22:06.757367 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:06.756905 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:06.773020 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:06.772921 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-q568r" podStartSLOduration=1.6277419929999999 podStartE2EDuration="3.772909853s" podCreationTimestamp="2026-04-20 21:22:03 +0000 UTC" firstStartedPulling="2026-04-20 21:22:03.630960751 +0000 UTC m=+573.359929329" lastFinishedPulling="2026-04-20 21:22:05.776128599 +0000 UTC m=+575.505097189" observedRunningTime="2026-04-20 21:22:06.772298807 +0000 UTC m=+576.501267403" watchObservedRunningTime="2026-04-20 21:22:06.772909853 +0000 UTC m=+576.501878461"
Apr 20 21:22:06.984646 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:06.984623 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:07.149836 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.149811 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv79f\" (UniqueName: \"kubernetes.io/projected/8d8f3ac4-b625-4ce3-bac7-b976d60d110a-kube-api-access-xv79f\") pod \"8d8f3ac4-b625-4ce3-bac7-b976d60d110a\" (UID: \"8d8f3ac4-b625-4ce3-bac7-b976d60d110a\") "
Apr 20 21:22:07.152039 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.152018 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8f3ac4-b625-4ce3-bac7-b976d60d110a-kube-api-access-xv79f" (OuterVolumeSpecName: "kube-api-access-xv79f") pod "8d8f3ac4-b625-4ce3-bac7-b976d60d110a" (UID: "8d8f3ac4-b625-4ce3-bac7-b976d60d110a"). InnerVolumeSpecName "kube-api-access-xv79f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:22:07.250995 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.250958 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xv79f\" (UniqueName: \"kubernetes.io/projected/8d8f3ac4-b625-4ce3-bac7-b976d60d110a-kube-api-access-xv79f\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\""
Apr 20 21:22:07.761046 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.761007 2567 generic.go:358] "Generic (PLEG): container finished" podID="8d8f3ac4-b625-4ce3-bac7-b976d60d110a" containerID="abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670" exitCode=0
Apr 20 21:22:07.761471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.761068 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-q568r"
Apr 20 21:22:07.761471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.761086 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q568r" event={"ID":"8d8f3ac4-b625-4ce3-bac7-b976d60d110a","Type":"ContainerDied","Data":"abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670"}
Apr 20 21:22:07.761471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.761132 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-q568r" event={"ID":"8d8f3ac4-b625-4ce3-bac7-b976d60d110a","Type":"ContainerDied","Data":"a709115b2cea34dc3ca6b8ab97482f51efe07f4ff064047567269802309e8b01"}
Apr 20 21:22:07.761471 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.761157 2567 scope.go:117] "RemoveContainer" containerID="abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670"
Apr 20 21:22:07.770586 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.770569 2567 scope.go:117] "RemoveContainer" containerID="abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670"
Apr 20 21:22:07.770852 ip-10-0-129-57 kubenswrapper[2567]: E0420 21:22:07.770835 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670\": container with ID starting with abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670 not found: ID does not exist" containerID="abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670"
Apr 20 21:22:07.770892 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.770859 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670"} err="failed to get container status \"abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670\": rpc error: code = NotFound desc = could not find container \"abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670\": container with ID starting with abaaad43d58249a3722ed4b6badf897fa7c5ee62109c9bcc60ff900d00fb6670 not found: ID does not exist"
Apr 20 21:22:07.782695 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.782670 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q568r"]
Apr 20 21:22:07.786472 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:07.786452 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-q568r"]
Apr 20 21:22:08.808278 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:08.808246 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8f3ac4-b625-4ce3-bac7-b976d60d110a" path="/var/lib/kubelet/pods/8d8f3ac4-b625-4ce3-bac7-b976d60d110a/volumes"
Apr 20 21:22:18.319007 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.318947 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-mvmxd"]
Apr 20 21:22:18.319397 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.319368 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d8f3ac4-b625-4ce3-bac7-b976d60d110a" containerName="manager"
Apr 20 21:22:18.319397 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.319383 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f3ac4-b625-4ce3-bac7-b976d60d110a" containerName="manager"
Apr 20 21:22:18.319468 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.319441 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d8f3ac4-b625-4ce3-bac7-b976d60d110a" containerName="manager"
Apr 20 21:22:18.321961 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.321944 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd"
Apr 20 21:22:18.324100 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.324079 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-prkzx\""
Apr 20 21:22:18.330389 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.330369 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-mvmxd"]
Apr 20 21:22:18.432484 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.432452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2gf2\" (UniqueName: \"kubernetes.io/projected/b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c-kube-api-access-r2gf2\") pod \"maas-controller-b7b7fc65d-mvmxd\" (UID: \"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c\") " pod="opendatahub/maas-controller-b7b7fc65d-mvmxd"
Apr 20 21:22:18.532968 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.532934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2gf2\" (UniqueName: \"kubernetes.io/projected/b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c-kube-api-access-r2gf2\") pod \"maas-controller-b7b7fc65d-mvmxd\" (UID: \"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c\") " pod="opendatahub/maas-controller-b7b7fc65d-mvmxd"
Apr 20 21:22:18.540772 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.540743 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2gf2\" (UniqueName: \"kubernetes.io/projected/b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c-kube-api-access-r2gf2\") pod \"maas-controller-b7b7fc65d-mvmxd\" (UID: \"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c\") " pod="opendatahub/maas-controller-b7b7fc65d-mvmxd"
Apr 20 21:22:18.633534 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.633506 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd"
Apr 20 21:22:18.751558 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.751532 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-mvmxd"]
Apr 20 21:22:18.753672 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:22:18.753642 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb880f2ee_83e2_41f3_a8e5_3d5c7a754c5c.slice/crio-bd0f4b332d00217ce26ba61dd45f752a25539385680d2af029dd54a26b2ee13f WatchSource:0}: Error finding container bd0f4b332d00217ce26ba61dd45f752a25539385680d2af029dd54a26b2ee13f: Status 404 returned error can't find the container with id bd0f4b332d00217ce26ba61dd45f752a25539385680d2af029dd54a26b2ee13f
Apr 20 21:22:18.801142 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:18.801117 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" event={"ID":"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c","Type":"ContainerStarted","Data":"bd0f4b332d00217ce26ba61dd45f752a25539385680d2af029dd54a26b2ee13f"}
Apr 20 21:22:20.809640 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:20.809606 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" event={"ID":"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c","Type":"ContainerStarted","Data":"3be19be96f93fe7552747474f21493b1ff5cd41e11a5fdadfd91426ec5f0b108"}
Apr 20 21:22:20.810033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:20.809753 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd"
Apr 20 21:22:20.844615 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:20.844564 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" podStartSLOduration=1.457596621 podStartE2EDuration="2.844547601s" podCreationTimestamp="2026-04-20 21:22:18 +0000 UTC" firstStartedPulling="2026-04-20 21:22:18.754865728 +0000 UTC m=+588.483834307" lastFinishedPulling="2026-04-20 21:22:20.141816708 +0000 UTC m=+589.870785287" observedRunningTime="2026-04-20 21:22:20.842156233 +0000 UTC m=+590.571124843" watchObservedRunningTime="2026-04-20 21:22:20.844547601 +0000 UTC m=+590.573516200"
Apr 20 21:22:26.827645 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:26.827619 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:22:31.819225 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:31.819193 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd"
Apr 20 21:22:49.735548 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.735506 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-667bfbcd57-qdzlz"]
Apr 20 21:22:49.739029 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.739012 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:49.741980 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.741949 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 21:22:49.741980 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.741975 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 21:22:49.742149 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.742060 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-nd7zd\""
Apr 20 21:22:49.748468 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.748448 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-667bfbcd57-qdzlz"]
Apr 20 21:22:49.896789 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.896750 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1d959f44-a5fe-4e59-8922-88e4a1b5e167-maas-api-tls\") pod \"maas-api-667bfbcd57-qdzlz\" (UID: \"1d959f44-a5fe-4e59-8922-88e4a1b5e167\") " pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:49.896955 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.896858 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qqc\" (UniqueName: \"kubernetes.io/projected/1d959f44-a5fe-4e59-8922-88e4a1b5e167-kube-api-access-x6qqc\") pod \"maas-api-667bfbcd57-qdzlz\" (UID: \"1d959f44-a5fe-4e59-8922-88e4a1b5e167\") " pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:49.998169 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.998090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1d959f44-a5fe-4e59-8922-88e4a1b5e167-maas-api-tls\") pod \"maas-api-667bfbcd57-qdzlz\" (UID: \"1d959f44-a5fe-4e59-8922-88e4a1b5e167\") " pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:49.998298 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:49.998177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6qqc\" (UniqueName: \"kubernetes.io/projected/1d959f44-a5fe-4e59-8922-88e4a1b5e167-kube-api-access-x6qqc\") pod \"maas-api-667bfbcd57-qdzlz\" (UID: \"1d959f44-a5fe-4e59-8922-88e4a1b5e167\") " pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:50.000762 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:50.000737 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1d959f44-a5fe-4e59-8922-88e4a1b5e167-maas-api-tls\") pod \"maas-api-667bfbcd57-qdzlz\" (UID: \"1d959f44-a5fe-4e59-8922-88e4a1b5e167\") " pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:50.005361 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:50.005341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6qqc\" (UniqueName: \"kubernetes.io/projected/1d959f44-a5fe-4e59-8922-88e4a1b5e167-kube-api-access-x6qqc\") pod \"maas-api-667bfbcd57-qdzlz\" (UID: \"1d959f44-a5fe-4e59-8922-88e4a1b5e167\") " pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:50.051359 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:50.051301 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:50.172858 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:50.172834 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-667bfbcd57-qdzlz"]
Apr 20 21:22:50.174749 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:22:50.174722 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d959f44_a5fe_4e59_8922_88e4a1b5e167.slice/crio-e92accb8539dc41eb04d46abc100b4c86da8fa78bacb952286d1ce840f66b170 WatchSource:0}: Error finding container e92accb8539dc41eb04d46abc100b4c86da8fa78bacb952286d1ce840f66b170: Status 404 returned error can't find the container with id e92accb8539dc41eb04d46abc100b4c86da8fa78bacb952286d1ce840f66b170
Apr 20 21:22:50.176069 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:50.176050 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:22:50.917276 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:50.917239 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-667bfbcd57-qdzlz" event={"ID":"1d959f44-a5fe-4e59-8922-88e4a1b5e167","Type":"ContainerStarted","Data":"e92accb8539dc41eb04d46abc100b4c86da8fa78bacb952286d1ce840f66b170"}
Apr 20 21:22:52.925060 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:52.925019 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-667bfbcd57-qdzlz" event={"ID":"1d959f44-a5fe-4e59-8922-88e4a1b5e167","Type":"ContainerStarted","Data":"229d7471a9838ece496ea16e5da477fd38df31267ca311215eb0d186518617ad"}
Apr 20 21:22:52.925456 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:52.925077 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:22:52.941047 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:52.941001 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-667bfbcd57-qdzlz" podStartSLOduration=2.170221445 podStartE2EDuration="3.940973383s" podCreationTimestamp="2026-04-20 21:22:49 +0000 UTC" firstStartedPulling="2026-04-20 21:22:50.176190902 +0000 UTC m=+619.905159479" lastFinishedPulling="2026-04-20 21:22:51.94694283 +0000 UTC m=+621.675911417" observedRunningTime="2026-04-20 21:22:52.938683517 +0000 UTC m=+622.667652128" watchObservedRunningTime="2026-04-20 21:22:52.940973383 +0000 UTC m=+622.669941982"
Apr 20 21:22:56.637811 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.637729 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"]
Apr 20 21:22:56.642050 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.642031 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.644176 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.644156 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 21:22:56.644279 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.644226 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 20 21:22:56.645033 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.645018 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-rl5wt\""
Apr 20 21:22:56.645103 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.645077 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 21:22:56.651480 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.651458 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"]
Apr 20 21:22:56.748747 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.748721 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.748888 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.748762 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.748888 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.748786 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.748888 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.748864 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.749018 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.748906 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.749018 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.748934 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdrlx\" (UniqueName: \"kubernetes.io/projected/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-kube-api-access-sdrlx\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.849551 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.849525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.849699 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.849565 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.849699 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.849603 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdrlx\" (UniqueName: \"kubernetes.io/projected/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-kube-api-access-sdrlx\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.849699 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.849669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.849861 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.849707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.849861 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.849741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.849962 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.849927 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.850087 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.850066 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.850192 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.850103 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.852117 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.852094 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.852306 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.852288 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.856314 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.856293 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdrlx\" (UniqueName: \"kubernetes.io/projected/bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d-kube-api-access-sdrlx\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hvhds\" (UID: \"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:56.953661 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:56.953578 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:22:57.082459 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:57.082433 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"]
Apr 20 21:22:57.084031 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:22:57.084002 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4041d1_6f6c_4b74_9d97_7d1e86d03e0d.slice/crio-479122fa85ae094c4f7a0a7b70cb224f768e263b27e337e07acccf0d20936712 WatchSource:0}: Error finding container 479122fa85ae094c4f7a0a7b70cb224f768e263b27e337e07acccf0d20936712: Status 404 returned error can't find the container with id 479122fa85ae094c4f7a0a7b70cb224f768e263b27e337e07acccf0d20936712
Apr 20 21:22:57.097020 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:57.090446 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:22:57.943644 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:57.943611 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds" event={"ID":"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d","Type":"ContainerStarted","Data":"479122fa85ae094c4f7a0a7b70cb224f768e263b27e337e07acccf0d20936712"}
Apr 20 21:22:58.935028 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:22:58.934977 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-667bfbcd57-qdzlz"
Apr 20 21:23:03.969814 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:03.969773 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds" event={"ID":"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d","Type":"ContainerStarted","Data":"5688412c8f87cf8229f87f72cb590cbff062e77b67454fb648e2c71ec785c624"}
Apr 20 21:23:08.990150 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:08.990071 2567 generic.go:358] "Generic (PLEG): container finished" podID="bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d" containerID="5688412c8f87cf8229f87f72cb590cbff062e77b67454fb648e2c71ec785c624" exitCode=0
Apr 20 21:23:08.990623 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:08.990160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds" event={"ID":"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d","Type":"ContainerDied","Data":"5688412c8f87cf8229f87f72cb590cbff062e77b67454fb648e2c71ec785c624"}
Apr 20 21:23:08.993693 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:08.993666 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:23:11.006125 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:11.006089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds" event={"ID":"bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d","Type":"ContainerStarted","Data":"067fe330f2d3113df7cb7f03ebb4cabcd339c8bde6d9dc66fb4a55979d501a79"}
Apr 20 21:23:11.006558 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:11.006315 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds"
Apr 20 21:23:11.024806 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:11.024762 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds" podStartSLOduration=1.97628145 podStartE2EDuration="15.02475015s" podCreationTimestamp="2026-04-20 21:22:56 +0000 UTC" firstStartedPulling="2026-04-20 21:22:57.085733729 +0000 UTC m=+626.814702306" lastFinishedPulling="2026-04-20 21:23:10.134202427 +0000 UTC
m=+639.863171006" observedRunningTime="2026-04-20 21:23:11.022046976 +0000 UTC m=+640.751015574" watchObservedRunningTime="2026-04-20 21:23:11.02475015 +0000 UTC m=+640.753718749" Apr 20 21:23:20.338629 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.338592 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm"] Apr 20 21:23:20.341493 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.341471 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.343650 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.343630 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 21:23:20.352424 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.352404 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm"] Apr 20 21:23:20.356380 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.356353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.356479 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.356390 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq8zw\" (UniqueName: \"kubernetes.io/projected/5d45e559-2ba8-4add-8d98-60760e253bf5-kube-api-access-qq8zw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.356531 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.356468 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.356568 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.356526 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.356635 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.356620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.356707 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.356689 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d45e559-2ba8-4add-8d98-60760e253bf5-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457321 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457284 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d45e559-2ba8-4add-8d98-60760e253bf5-tls-certs\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457466 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457335 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457466 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457366 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq8zw\" (UniqueName: \"kubernetes.io/projected/5d45e559-2ba8-4add-8d98-60760e253bf5-kube-api-access-qq8zw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457466 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457466 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457689 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457515 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457845 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457822 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457939 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457864 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.457939 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.457902 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.459716 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.459698 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5d45e559-2ba8-4add-8d98-60760e253bf5-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.460098 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.460077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d45e559-2ba8-4add-8d98-60760e253bf5-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.463961 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.463941 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq8zw\" (UniqueName: \"kubernetes.io/projected/5d45e559-2ba8-4add-8d98-60760e253bf5-kube-api-access-qq8zw\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm\" (UID: \"5d45e559-2ba8-4add-8d98-60760e253bf5\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.653983 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.653902 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:20.781952 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.781927 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm"] Apr 20 21:23:20.783445 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:23:20.783419 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d45e559_2ba8_4add_8d98_60760e253bf5.slice/crio-f1ecf6428b2fbd94728200870383ce249057c2a4d087d1bb95b7cce1bb295804 WatchSource:0}: Error finding container f1ecf6428b2fbd94728200870383ce249057c2a4d087d1bb95b7cce1bb295804: Status 404 returned error can't find the container with id f1ecf6428b2fbd94728200870383ce249057c2a4d087d1bb95b7cce1bb295804 Apr 20 21:23:20.884969 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:20.884936 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:23:21.042095 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:21.042015 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" event={"ID":"5d45e559-2ba8-4add-8d98-60760e253bf5","Type":"ContainerStarted","Data":"bdb4cab71c4c77d8bf4ccaa5772b1fee5f0ca5208cbf5bf6a55bc980c4bdb705"} Apr 20 21:23:21.042095 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:21.042050 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" event={"ID":"5d45e559-2ba8-4add-8d98-60760e253bf5","Type":"ContainerStarted","Data":"f1ecf6428b2fbd94728200870383ce249057c2a4d087d1bb95b7cce1bb295804"} Apr 20 21:23:22.028936 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:22.028909 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hvhds" Apr 20 21:23:24.382526 ip-10-0-129-57 kubenswrapper[2567]: I0420 
21:23:24.382487 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:23:27.069134 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:27.069098 2567 generic.go:358] "Generic (PLEG): container finished" podID="5d45e559-2ba8-4add-8d98-60760e253bf5" containerID="bdb4cab71c4c77d8bf4ccaa5772b1fee5f0ca5208cbf5bf6a55bc980c4bdb705" exitCode=0 Apr 20 21:23:27.069552 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:27.069173 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" event={"ID":"5d45e559-2ba8-4add-8d98-60760e253bf5","Type":"ContainerDied","Data":"bdb4cab71c4c77d8bf4ccaa5772b1fee5f0ca5208cbf5bf6a55bc980c4bdb705"} Apr 20 21:23:28.074591 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:28.074557 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" event={"ID":"5d45e559-2ba8-4add-8d98-60760e253bf5","Type":"ContainerStarted","Data":"af089853abf87ba22997003202cdd0aa15ae9d61b983510d2d2329141626d09a"} Apr 20 21:23:28.075011 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:28.074768 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:28.093227 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:28.093174 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" podStartSLOduration=7.890378941 podStartE2EDuration="8.093158645s" podCreationTimestamp="2026-04-20 21:23:20 +0000 UTC" firstStartedPulling="2026-04-20 21:23:27.069760888 +0000 UTC m=+656.798729465" lastFinishedPulling="2026-04-20 21:23:27.272540591 +0000 UTC m=+657.001509169" observedRunningTime="2026-04-20 21:23:28.089820395 +0000 UTC m=+657.818789008" watchObservedRunningTime="2026-04-20 21:23:28.093158645 +0000 UTC m=+657.822127244" Apr 20 21:23:39.091593 ip-10-0-129-57 
kubenswrapper[2567]: I0420 21:23:39.091556 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm" Apr 20 21:23:44.087764 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:44.087724 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:23:49.687156 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:23:49.687117 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:24:47.377647 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:24:47.377611 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:24:51.780410 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:24:51.780374 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:24:57.979614 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:24:57.979578 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:25:08.284705 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:08.284665 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:25:16.989898 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:16.989866 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:25:28.386148 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:28.386102 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:25:37.581308 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:37.581277 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:25:47.386375 ip-10-0-129-57 kubenswrapper[2567]: 
I0420 21:25:47.386337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:25:54.444591 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.444496 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-mvmxd"] Apr 20 21:25:54.445157 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.444728 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" podUID="b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c" containerName="manager" containerID="cri-o://3be19be96f93fe7552747474f21493b1ff5cd41e11a5fdadfd91426ec5f0b108" gracePeriod=10 Apr 20 21:25:54.620268 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.620239 2567 generic.go:358] "Generic (PLEG): container finished" podID="b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c" containerID="3be19be96f93fe7552747474f21493b1ff5cd41e11a5fdadfd91426ec5f0b108" exitCode=0 Apr 20 21:25:54.620386 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.620288 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" event={"ID":"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c","Type":"ContainerDied","Data":"3be19be96f93fe7552747474f21493b1ff5cd41e11a5fdadfd91426ec5f0b108"} Apr 20 21:25:54.678130 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.678105 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" Apr 20 21:25:54.707234 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.707177 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2gf2\" (UniqueName: \"kubernetes.io/projected/b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c-kube-api-access-r2gf2\") pod \"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c\" (UID: \"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c\") " Apr 20 21:25:54.709344 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.709315 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c-kube-api-access-r2gf2" (OuterVolumeSpecName: "kube-api-access-r2gf2") pod "b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c" (UID: "b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c"). InnerVolumeSpecName "kube-api-access-r2gf2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:25:54.807921 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:54.807895 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2gf2\" (UniqueName: \"kubernetes.io/projected/b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c-kube-api-access-r2gf2\") on node \"ip-10-0-129-57.ec2.internal\" DevicePath \"\"" Apr 20 21:25:55.625322 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:55.625287 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" Apr 20 21:25:55.625748 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:55.625283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b7b7fc65d-mvmxd" event={"ID":"b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c","Type":"ContainerDied","Data":"bd0f4b332d00217ce26ba61dd45f752a25539385680d2af029dd54a26b2ee13f"} Apr 20 21:25:55.625748 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:55.625410 2567 scope.go:117] "RemoveContainer" containerID="3be19be96f93fe7552747474f21493b1ff5cd41e11a5fdadfd91426ec5f0b108" Apr 20 21:25:55.642224 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:55.642197 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-mvmxd"] Apr 20 21:25:55.645650 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:55.645625 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-mvmxd"] Apr 20 21:25:56.807440 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:25:56.807400 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c" path="/var/lib/kubelet/pods/b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c/volumes" Apr 20 21:26:50.283892 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:26:50.283857 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:27:05.682258 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:27:05.682222 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:27:44.088566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:27:44.088520 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:28:00.486304 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:28:00.486267 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:28:15.678736 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:28:15.678695 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:28:32.085027 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:28:32.084978 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:29:24.284143 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:29:24.284107 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:29:32.687590 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:29:32.687557 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:29:49.287264 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:29:49.283059 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:29:57.979622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:29:57.979583 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:30:14.778755 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:30:14.778717 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:30:24.183215 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:30:24.183128 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:30:56.089026 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:30:56.088969 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:31:04.978824 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:31:04.978776 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:31:13.581079 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:31:13.581043 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:31:21.887533 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:31:21.887498 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:31:30.176277 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:31:30.176238 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:31:46.679442 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:31:46.679407 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:31:57.679923 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:31:57.679840 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:32:43.986297 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:32:43.986255 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:32:52.480460 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:32:52.480421 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:33:02.182060 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:33:02.182021 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:33:10.390437 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:33:10.390397 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:33:19.084695 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:33:19.084660 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:33:28.282171 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:33:28.282097 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:33:37.390294 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:33:37.390255 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:33:45.279246 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:33:45.279207 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:33:53.883369 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:33:53.883331 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:34:02.482586 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:34:02.482551 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:34:12.387143 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:34:12.387101 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:34:20.801441 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:34:20.801393 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:34:29.287572 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:34:29.287537 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:34:38.584299 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:34:38.584249 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:34:47.481002 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:34:47.480948 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:34:54.686028 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:34:54.685929 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:35:04.286353 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:35:04.286314 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:35:12.186976 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:35:12.186939 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:37:29.385187 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:37:29.385153 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:37:34.673128 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:37:34.673096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:38:00.586359 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:38:00.586332 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:38:04.982972 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:38:04.982942 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:38:15.378145 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:38:15.378109 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:38:25.185223 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:38:25.185191 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"] Apr 20 21:38:34.187792 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:38:34.187750 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:38:44.486121 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:38:44.486087 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:38:53.387590 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:38:53.387559 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:39:03.988427 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:39:03.988391 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:39:13.011139 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:39:13.011095 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:39:23.684404 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:39:23.684324 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:39:32.483539 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:39:32.483501 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:40:06.184525 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:40:06.184491 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:40:48.604426 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:40:48.604393 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:40:56.208090 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:40:56.208012 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:41:04.790588 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:41:04.790556 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:41:14.283976 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:41:14.283939 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:41:23.184205 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:41:23.184171 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:41:35.482500 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:41:35.482467 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:41:44.384500 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:41:44.384460 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:41:52.086028 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:41:52.085979 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:42:00.689523 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:42:00.689489 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:42:08.979402 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:42:08.979366 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:42:17.785537 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:42:17.785499 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:42:30.354048 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:42:30.354013 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:42:47.587470 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:42:47.587421 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:42:56.886901 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:42:56.886872 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:43:05.582444 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:43:05.582405 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:43:12.908079 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:43:12.908041 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:43:30.483836 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:43:30.483798 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:43:39.078666 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:43:39.078630 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:43:47.283396 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:43:47.283363 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:43:56.375489 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:43:56.375413 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:44:04.976256 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:44:04.976227 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:44:13.284921 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:44:13.284870 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:44:22.289547 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:44:22.289512 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:44:33.492132 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:44:33.492096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:44:42.278892 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:44:42.278861 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:44:54.285628 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:44:54.285590 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:45:03.086074 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:45:03.086038 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:45:10.685284 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:45:10.685249 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:45:18.787859 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:45:18.787829 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:45:27.583006 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:45:27.582921 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:45:43.682655 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:45:43.682623 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:45:52.778831 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:45:52.778776 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:46:01.692206 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:01.692166 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:46:09.182737 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:09.182706 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:46:33.783516 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:33.783478 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:46:46.383133 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:46.383098 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:46:50.874407 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:50.874327 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-q6v89"]
Apr 20 21:46:51.383487 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:51.383459 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-cxmzf_6ec111a7-2378-4ead-875a-06574d248b03/manager/0.log"
Apr 20 21:46:51.505095 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:51.505073 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-667bfbcd57-qdzlz_1d959f44-a5fe-4e59-8922-88e4a1b5e167/maas-api/0.log"
Apr 20 21:46:51.728449 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:51.728387 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-kp466_f82b7fb7-1330-451e-8bb2-65948b52b19c/manager/2.log"
Apr 20 21:46:52.097425 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:52.097396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-85fc55dd88-wpc79_640c891e-2faa-4e30-bdd1-e531b6ec685f/manager/0.log"
Apr 20 21:46:53.381171 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:53.381143 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw_837d9f42-ca4b-414b-b8f2-1670d9c6f266/pull/0.log"
Apr 20 21:46:53.387409 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:53.387389 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw_837d9f42-ca4b-414b-b8f2-1670d9c6f266/extract/0.log"
Apr 20 21:46:53.392677 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:53.392661 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw_837d9f42-ca4b-414b-b8f2-1670d9c6f266/util/0.log"
Apr 20 21:46:53.630838 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:53.630812 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-tvwzd_2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd/manager/0.log"
Apr 20 21:46:53.977622 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:53.977598 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bp6zs_86e1a7e0-4997-4d3b-aac9-1505b67a27e3/registry-server/0.log"
Apr 20 21:46:54.207212 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:54.207177 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-q6v89_bff4826b-070b-4dbc-898a-13349b2c56f9/limitador/0.log"
Apr 20 21:46:54.795360 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:54.795334 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-h9k8p_502928a8-1d20-4943-a5c1-077a4b81c99e/discovery/0.log"
Apr 20 21:46:54.908094 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:54.908066 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-b57dc9cf9-qjjgd_d2ea1a05-eff3-4922-9ed4-03c04ccf987c/kube-auth-proxy/0.log"
Apr 20 21:46:55.709092 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:55.709063 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm_5d45e559-2ba8-4add-8d98-60760e253bf5/storage-initializer/0.log"
Apr 20 21:46:55.715431 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:55.715400 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-nl5wm_5d45e559-2ba8-4add-8d98-60760e253bf5/main/0.log"
Apr 20 21:46:55.823122 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:55.823094 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hvhds_bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d/storage-initializer/0.log"
Apr 20 21:46:55.829588 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:46:55.829570 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hvhds_bf4041d1-6f6c-4b74-9d97-7d1e86d03e0d/main/0.log"
Apr 20 21:47:02.859157 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:02.859128 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-blrzp_62995ee3-d913-46d1-a08a-f154a1b3137d/global-pull-secret-syncer/0.log"
Apr 20 21:47:03.122906 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:03.122837 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wl9z5_7b16dac9-24c6-43b4-a23f-0a9c62fb7317/konnectivity-agent/0.log"
Apr 20 21:47:03.172582 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:03.172556 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-57.ec2.internal_011bd93cb6528efda482582d85ad698c/haproxy/0.log"
Apr 20 21:47:07.288145 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:07.288119 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw_837d9f42-ca4b-414b-b8f2-1670d9c6f266/extract/0.log"
Apr 20 21:47:07.309361 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:07.309339 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw_837d9f42-ca4b-414b-b8f2-1670d9c6f266/util/0.log"
Apr 20 21:47:07.329487 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:07.329467 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1m7cxw_837d9f42-ca4b-414b-b8f2-1670d9c6f266/pull/0.log"
Apr 20 21:47:07.638291 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:07.638269 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-tvwzd_2d5adbef-9c5f-40fe-903a-5b7fc3f0c1bd/manager/0.log"
Apr 20 21:47:07.721838 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:07.721811 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bp6zs_86e1a7e0-4997-4d3b-aac9-1505b67a27e3/registry-server/0.log"
Apr 20 21:47:07.795493 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:07.795472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-q6v89_bff4826b-070b-4dbc-898a-13349b2c56f9/limitador/0.log"
Apr 20 21:47:09.270542 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.270518 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3e2c9a4b-b674-44a4-bd07-72f35fda57b0/alertmanager/0.log"
Apr 20 21:47:09.291411 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.291388 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3e2c9a4b-b674-44a4-bd07-72f35fda57b0/config-reloader/0.log"
Apr 20 21:47:09.315689 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.315653 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3e2c9a4b-b674-44a4-bd07-72f35fda57b0/kube-rbac-proxy-web/0.log"
Apr 20 21:47:09.335779 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.335755 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3e2c9a4b-b674-44a4-bd07-72f35fda57b0/kube-rbac-proxy/0.log"
Apr 20 21:47:09.360258 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.360240 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3e2c9a4b-b674-44a4-bd07-72f35fda57b0/kube-rbac-proxy-metric/0.log"
Apr 20 21:47:09.392368 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.392347 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3e2c9a4b-b674-44a4-bd07-72f35fda57b0/prom-label-proxy/0.log"
Apr 20 21:47:09.412531 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.412513 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3e2c9a4b-b674-44a4-bd07-72f35fda57b0/init-config-reloader/0.log"
Apr 20 21:47:09.710566 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.710546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wr4jc_e93c295d-891c-4a62-9dd6-ebb8b010f291/node-exporter/0.log"
Apr 20 21:47:09.729239 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.729221 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wr4jc_e93c295d-891c-4a62-9dd6-ebb8b010f291/kube-rbac-proxy/0.log"
Apr 20 21:47:09.750339 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:09.750321 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wr4jc_e93c295d-891c-4a62-9dd6-ebb8b010f291/init-textfile/0.log"
Apr 20 21:47:10.081162 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.081088 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8ccc47d7c-frb8n_607c02c5-695a-4a3d-bf04-683531b59ce0/telemeter-client/0.log"
Apr 20 21:47:10.100659 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.100632 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8ccc47d7c-frb8n_607c02c5-695a-4a3d-bf04-683531b59ce0/reload/0.log"
Apr 20 21:47:10.120618 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.120595 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8ccc47d7c-frb8n_607c02c5-695a-4a3d-bf04-683531b59ce0/kube-rbac-proxy/0.log"
Apr 20 21:47:10.145369 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.145348 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d47bc568f-zcckd_8748fbe8-d18a-419c-9436-b306ab28eaf7/thanos-query/0.log"
Apr 20 21:47:10.166605 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.166586 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d47bc568f-zcckd_8748fbe8-d18a-419c-9436-b306ab28eaf7/kube-rbac-proxy-web/0.log"
Apr 20 21:47:10.185855 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.185835 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d47bc568f-zcckd_8748fbe8-d18a-419c-9436-b306ab28eaf7/kube-rbac-proxy/0.log"
Apr 20 21:47:10.205049 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.205028 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d47bc568f-zcckd_8748fbe8-d18a-419c-9436-b306ab28eaf7/prom-label-proxy/0.log"
Apr 20 21:47:10.225832 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.225815 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d47bc568f-zcckd_8748fbe8-d18a-419c-9436-b306ab28eaf7/kube-rbac-proxy-rules/0.log"
Apr 20 21:47:10.244782 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:10.244764 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d47bc568f-zcckd_8748fbe8-d18a-419c-9436-b306ab28eaf7/kube-rbac-proxy-metrics/0.log"
Apr 20 21:47:11.294203 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.294176 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"]
Apr 20 21:47:11.300133 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.300108 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c" containerName="manager"
Apr 20 21:47:11.300133 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.300131 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c" containerName="manager"
Apr 20 21:47:11.300338 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.300227 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b880f2ee-83e2-41f3-a8e5-3d5c7a754c5c" containerName="manager"
Apr 20 21:47:11.303844 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.303821 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"]
Apr 20 21:47:11.303961 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.303918 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.305968 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.305944 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rs84k\"/\"kube-root-ca.crt\""
Apr 20 21:47:11.306720 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.306701 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rs84k\"/\"default-dockercfg-b9scs\""
Apr 20 21:47:11.306780 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.306746 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rs84k\"/\"openshift-service-ca.crt\""
Apr 20 21:47:11.361101 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.361073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7j22\" (UniqueName: \"kubernetes.io/projected/2107f2ab-b632-4eb6-bec5-15d1a3739145-kube-api-access-v7j22\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.361195 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.361105 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-proc\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.361195 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.361123 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-lib-modules\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.361195 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.361191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-sys\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.361295 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.361245 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-podres\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462023 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.461976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7j22\" (UniqueName: \"kubernetes.io/projected/2107f2ab-b632-4eb6-bec5-15d1a3739145-kube-api-access-v7j22\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462113 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462028 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-proc\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462113 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462052 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-lib-modules\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462113 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462087 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-sys\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462240 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-podres\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462240 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-proc\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462240 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462230 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-lib-modules\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462346 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462246 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-sys\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.462346 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.462260 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2107f2ab-b632-4eb6-bec5-15d1a3739145-podres\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.469211 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.469191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7j22\" (UniqueName: \"kubernetes.io/projected/2107f2ab-b632-4eb6-bec5-15d1a3739145-kube-api-access-v7j22\") pod \"perf-node-gather-daemonset-gzvgb\" (UID: \"2107f2ab-b632-4eb6-bec5-15d1a3739145\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.614434 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.614415 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:11.731777 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.731752 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"]
Apr 20 21:47:11.734074 ip-10-0-129-57 kubenswrapper[2567]: W0420 21:47:11.734046 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2107f2ab_b632_4eb6_bec5_15d1a3739145.slice/crio-ffb42dbd4928253114a20132ff05ea527b879c4c834d6cebade66284b4f87373 WatchSource:0}: Error finding container ffb42dbd4928253114a20132ff05ea527b879c4c834d6cebade66284b4f87373: Status 404 returned error can't find the container with id ffb42dbd4928253114a20132ff05ea527b879c4c834d6cebade66284b4f87373
Apr 20 21:47:11.735724 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:11.735709 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:47:12.212780 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:12.212757 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-859ff44bcd-tlr2d_f721edfa-67f3-4f75-b7da-038490a83e97/console/0.log"
Apr 20 21:47:12.280179 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:12.280154 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb" event={"ID":"2107f2ab-b632-4eb6-bec5-15d1a3739145","Type":"ContainerStarted","Data":"878b4724b26bd3abfbb1ce0dbbf2db2fb85a3297e52d061d66f8df1701f06848"}
Apr 20 21:47:12.280316 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:12.280183 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb" event={"ID":"2107f2ab-b632-4eb6-bec5-15d1a3739145","Type":"ContainerStarted","Data":"ffb42dbd4928253114a20132ff05ea527b879c4c834d6cebade66284b4f87373"}
Apr 20 21:47:12.280316 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:12.280272 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:12.294563 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:12.294526 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb" podStartSLOduration=1.2945144960000001 podStartE2EDuration="1.294514496s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:47:12.292538632 +0000 UTC m=+2082.021507229" watchObservedRunningTime="2026-04-20 21:47:12.294514496 +0000 UTC m=+2082.023483139"
Apr 20 21:47:13.402277 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:13.402247 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cnvwm_770298fa-c6f6-4828-8681-c2a0e5ebc1b5/dns/0.log"
Apr 20 21:47:13.422423 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:13.422400 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cnvwm_770298fa-c6f6-4828-8681-c2a0e5ebc1b5/kube-rbac-proxy/0.log"
Apr 20 21:47:13.567396 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:13.567365 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qfhjp_0b4898da-9e0d-4a11-bec8-8eba5efe7422/dns-node-resolver/0.log"
Apr 20 21:47:14.067277 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:14.067254 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s6cxs_3b0c9c36-7f31-4319-bc80-862234ec47e6/node-ca/0.log"
Apr 20 21:47:14.918599 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:14.918566 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-h9k8p_502928a8-1d20-4943-a5c1-077a4b81c99e/discovery/0.log"
Apr 20 21:47:14.936302 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:14.936284 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-b57dc9cf9-qjjgd_d2ea1a05-eff3-4922-9ed4-03c04ccf987c/kube-auth-proxy/0.log"
Apr 20 21:47:15.525628 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:15.525594 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mm5c7_dc6985aa-5589-44b0-97f0-b862837c4008/serve-healthcheck-canary/0.log"
Apr 20 21:47:16.036318 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:16.036292 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmpf4_680afb98-3015-4cc5-8729-e0c66f98f554/kube-rbac-proxy/0.log"
Apr 20 21:47:16.081382 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:16.081361 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmpf4_680afb98-3015-4cc5-8729-e0c66f98f554/exporter/0.log"
Apr 20 21:47:16.134404 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:16.134381 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kmpf4_680afb98-3015-4cc5-8729-e0c66f98f554/extractor/0.log"
Apr 20 21:47:17.945619 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:17.945593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-cxmzf_6ec111a7-2378-4ead-875a-06574d248b03/manager/0.log"
Apr 20 21:47:17.974345 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:17.974319 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-667bfbcd57-qdzlz_1d959f44-a5fe-4e59-8922-88e4a1b5e167/maas-api/0.log"
Apr 20 21:47:18.048438 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:18.048415 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-kp466_f82b7fb7-1330-451e-8bb2-65948b52b19c/manager/1.log"
Apr 20 21:47:18.058940 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:18.058917 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-kp466_f82b7fb7-1330-451e-8bb2-65948b52b19c/manager/2.log"
Apr 20 21:47:18.197416 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:18.197335 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-85fc55dd88-wpc79_640c891e-2faa-4e30-bdd1-e531b6ec685f/manager/0.log"
Apr 20 21:47:18.294086 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:18.294063 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-gzvgb"
Apr 20 21:47:19.296743 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:19.296719 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-lnxtf_b23b4958-e2b7-4866-90b5-761439aeea55/openshift-lws-operator/0.log"
Apr 20 21:47:25.207531 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.207507 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqrxp_089c1db7-01a7-42ee-bf2b-a07303e05826/kube-multus-additional-cni-plugins/0.log"
Apr 20 21:47:25.226980 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.226955 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqrxp_089c1db7-01a7-42ee-bf2b-a07303e05826/egress-router-binary-copy/0.log"
Apr 20 21:47:25.245817 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.245792 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqrxp_089c1db7-01a7-42ee-bf2b-a07303e05826/cni-plugins/0.log"
Apr 20 21:47:25.265652 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.265637 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqrxp_089c1db7-01a7-42ee-bf2b-a07303e05826/bond-cni-plugin/0.log"
Apr 20 21:47:25.284798 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.284781 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqrxp_089c1db7-01a7-42ee-bf2b-a07303e05826/routeoverride-cni/0.log"
Apr 20 21:47:25.306303 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.306285 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqrxp_089c1db7-01a7-42ee-bf2b-a07303e05826/whereabouts-cni-bincopy/0.log"
Apr 20 21:47:25.324423 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.324401 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wqrxp_089c1db7-01a7-42ee-bf2b-a07303e05826/whereabouts-cni/0.log"
Apr 20 21:47:25.376247 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.376226 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fww58_af220c26-aa9e-4624-b5ca-0581df206506/kube-multus/0.log"
Apr 20 21:47:25.474683 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.474621 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vc5dw_03abd218-9d5d-4f78-9ff1-919c66c5417e/network-metrics-daemon/0.log"
Apr 20 21:47:25.491868 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:25.491853 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vc5dw_03abd218-9d5d-4f78-9ff1-919c66c5417e/kube-rbac-proxy/0.log"
Apr 20 21:47:26.493654 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.493578 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/ovn-controller/0.log"
Apr 20 
21:47:26.519122 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.519095 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/ovn-acl-logging/0.log" Apr 20 21:47:26.536694 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.536673 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/kube-rbac-proxy-node/0.log" Apr 20 21:47:26.558557 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.558515 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 21:47:26.582470 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.582450 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/northd/0.log" Apr 20 21:47:26.603165 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.603151 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/nbdb/0.log" Apr 20 21:47:26.623586 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.623567 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/sbdb/0.log" Apr 20 21:47:26.710815 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:26.710795 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prfjl_670b3244-d038-4e79-8acc-575b465321dc/ovnkube-controller/0.log" Apr 20 21:47:28.053287 ip-10-0-129-57 kubenswrapper[2567]: I0420 21:47:28.053259 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-rqns6_6fde3cd9-8c1d-4801-8eeb-c3bfd3815846/network-check-target-container/0.log"