Apr 23 01:10:12.927245 ip-10-0-135-74 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 01:10:13.341383 ip-10-0-135-74 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:10:13.341383 ip-10-0-135-74 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 01:10:13.341383 ip-10-0-135-74 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:10:13.341383 ip-10-0-135-74 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 01:10:13.341383 ip-10-0-135-74 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:10:13.343661 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.343585 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 01:10:13.347936 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347921 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347938 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
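The deprecation warnings above all point at the kubelet's --config file. A minimal sketch of the equivalent KubeletConfiguration follows; the field names come from the kubelet config API, but every value here is an illustrative assumption, not read from this node:

```yaml
# Illustrative KubeletConfiguration sketch (values are assumptions).
# Referenced by the kubelet's --config flag, e.g. /etc/kubernetes/kubelet.conf.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# eviction thresholds replace --minimum-container-ttl-duration, per the warning
evictionHard:
  memory.available: 100Mi
```

Flags that remain on the command line override the corresponding config-file fields, which is why the kubelet only warns rather than failing here.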
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347942 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347946 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347949 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347953 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347956 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347959 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347961 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347965 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:10:13.347967 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347967 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347971 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347988 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347992 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347996 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348000 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348003 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348005 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348007 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348011 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348013 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348016 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348019 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348021 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348024 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348026 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348033 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348035 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348038 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:10:13.348203 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348040 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348042 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348045 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348053 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348056 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348060 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348064 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348067 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348070 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348072 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348075 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348077 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348079 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348082 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348084 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348087 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348090 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348092 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348095 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348098 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348101 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:10:13.348655 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348103 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348106 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348109 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348112 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348114 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348117 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348119 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348122 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348124 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348127 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348130 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348132 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348134 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348137 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348139 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348142 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348144 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348146 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348149 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348151 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:10:13.349167 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348154 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348156 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348158 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348161 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348166 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348169 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348172 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348177 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348179 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348182 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348184 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348186 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348189 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348192 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348194 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348197 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348541 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348545 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348548 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348551 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:10:13.349653 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348554 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348556 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348559 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348561 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348564 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348567 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348569 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348572 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348574 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348576 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348579 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348581 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348584 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348586 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348589 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348592 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
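Note that the same gate names recur with later wall-clock timestamps (e.g. ExternalOIDC at 01:10:13.347921 and again at 01:10:13.348721): the kubelet evaluates its feature-gate map more than once during startup, so each unknown gate is logged per pass. A short script to deduplicate and count the warnings, assuming journal lines in the format shown above (the function name is ours, not part of any tool):

```python
import re
from collections import Counter

# Matches the gate name at the end of a feature_gate.go:328 warning line.
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def count_unrecognized_gates(log_lines):
    """Count how many times each unrecognized feature gate is reported."""
    counts = Counter()
    for line in log_lines:
        m = GATE_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = [
    "Apr 23 01:10:13.347936 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.347921 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC",
    "Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348721 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC",
]
print(count_unrecognized_gates(sample))  # Counter({'ExternalOIDC': 2})
```

A count of exactly 2 per gate confirms straight repetition from the second parsing pass rather than a new set of unknown gates.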
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348596 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348599 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348602 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348605 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:10:13.350128 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348608 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348610 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348613 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348615 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348618 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348620 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348623 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348625 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348628 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348630 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348632 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348635 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348637 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348640 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348642 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348644 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348647 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348650 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348652 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348655 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:10:13.350599 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348657 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348660 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348663 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348665 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348668 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348670 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348673 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348676 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348678 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348682 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348684 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348687 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348689 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348691 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348694 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348696 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348699 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348701 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348705 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:10:13.351084 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348708 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348711 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348714 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348717 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348719 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348721 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348724 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348727 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348730 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348732 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348736 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348738 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348740 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348743 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348745 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348748 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348750 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348752 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348755 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348757 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:10:13.351533 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348760 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348762 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.348765 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349447 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349462 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349468 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349473 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349477 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349480 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349484 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349489 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349492 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349495 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349499 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349502 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349505 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349508 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349511 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349514 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349517 2565 flags.go:64] FLAG: --cloud-config=""
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349520 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349523 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349527 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349530 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 01:10:13.352039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349533 2565 flags.go:64] FLAG: --config-dir=""
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349535 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349538 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349542 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349545 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349548 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349551 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349554 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349557 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349559 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349563 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349566 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349570 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349573 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349576 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349579 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349582 2565 flags.go:64] FLAG: --enable-server="true"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349585 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349589 2565 flags.go:64] FLAG: --event-burst="100"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349592 2565 flags.go:64] FLAG: --event-qps="50"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349595 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349597 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349601 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349604 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 01:10:13.352595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349607 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349610 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349613 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349616 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349619 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349622 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349625 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349628 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349631 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349633 2565 flags.go:64]
FLAG: --feature-gates="" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349637 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349640 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349643 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349646 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349648 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349651 2565 flags.go:64] FLAG: --help="false" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349654 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-135-74.ec2.internal" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349657 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349661 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349664 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349667 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349671 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349674 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 01:10:13.353161 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:10:13.349677 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349680 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349683 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349685 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349688 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349691 2565 flags.go:64] FLAG: --kube-reserved="" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349694 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349697 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349700 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349702 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349705 2565 flags.go:64] FLAG: --lock-file="" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349708 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349711 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349714 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349719 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 01:10:13.353684 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349722 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349725 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349727 2565 flags.go:64] FLAG: --logging-format="text" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349730 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349734 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349737 2565 flags.go:64] FLAG: --manifest-url="" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349739 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349744 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349747 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349751 2565 flags.go:64] FLAG: --max-pods="110" Apr 23 01:10:13.353684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349753 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349756 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349759 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349762 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349765 2565 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349767 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349771 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349778 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349781 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349784 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349786 2565 flags.go:64] FLAG: --pod-cidr="" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349790 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349795 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349798 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349801 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349804 2565 flags.go:64] FLAG: --port="10250" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349807 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349810 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a23468e91f67adc7" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:10:13.349813 2565 flags.go:64] FLAG: --qos-reserved="" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349816 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349819 2565 flags.go:64] FLAG: --register-node="true" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349822 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349825 2565 flags.go:64] FLAG: --register-with-taints="" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349828 2565 flags.go:64] FLAG: --registry-burst="10" Apr 23 01:10:13.354305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349831 2565 flags.go:64] FLAG: --registry-qps="5" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349834 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349837 2565 flags.go:64] FLAG: --reserved-memory="" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349841 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349844 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349847 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349850 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349853 2565 flags.go:64] FLAG: --runonce="false" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349855 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349859 2565 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349862 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349865 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349868 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349870 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349874 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349877 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349880 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349882 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349885 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349888 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349891 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349893 2565 flags.go:64] FLAG: --system-cgroups="" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349896 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349901 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 
01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349904 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 23 01:10:13.354870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349907 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349911 2565 flags.go:64] FLAG: --tls-min-version="" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349917 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349920 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349923 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349926 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349928 2565 flags.go:64] FLAG: --v="2" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349932 2565 flags.go:64] FLAG: --version="false" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349936 2565 flags.go:64] FLAG: --vmodule="" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349941 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.349944 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350047 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350052 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350055 
2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350058 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350060 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350063 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350065 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350068 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350071 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350073 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350076 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 01:10:13.355477 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350078 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350081 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350084 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350086 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 
01:10:13.350089 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350091 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350094 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350096 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350099 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350101 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350104 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350107 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350113 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350116 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350118 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350121 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350123 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350126 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350129 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 01:10:13.356099 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350132 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350135 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350138 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350140 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350143 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350146 2565 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350149 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350151 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350155 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350158 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350161 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350165 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350167 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350170 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350173 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350176 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350179 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350182 2565 feature_gate.go:328] unrecognized feature 
gate: EtcdBackendQuota Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350185 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 01:10:13.356579 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350187 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350190 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350192 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350195 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350198 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350200 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350204 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350206 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350209 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350211 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350214 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350216 2565 feature_gate.go:328] unrecognized feature 
gate: AWSClusterHostedDNSInstall Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350219 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350221 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350224 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350227 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350229 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350232 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350234 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350237 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 01:10:13.357051 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350239 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350241 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350244 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350247 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 01:10:13.357550 ip-10-0-135-74 
kubenswrapper[2565]: W0423 01:10:13.350249 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350252 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350254 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350257 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350260 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350262 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350265 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350267 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350270 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350272 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350275 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350277 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 01:10:13.357550 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.350280 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 01:10:13.357920 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.350864 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 01:10:13.358078 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.358060 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 01:10:13.358116 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.358079 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 01:10:13.358146 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358125 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 01:10:13.358146 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358130 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 01:10:13.358146 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358134 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 01:10:13.358146 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358138 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 01:10:13.358146 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358141 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 01:10:13.358146 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358144 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 01:10:13.358146 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358146 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 01:10:13.358146 
ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358150 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358153 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358155 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358158 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358160 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358163 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358166 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358168 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358171 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358174 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358176 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358179 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358182 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities 
Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358184 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358187 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358189 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358192 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358194 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358197 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358199 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 01:10:13.358338 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358202 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358205 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358207 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358210 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358213 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358215 2565 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358218 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358220 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358223 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358226 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358228 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358230 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358233 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358235 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358239 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358241 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358243 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358246 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 01:10:13.358801 ip-10-0-135-74 
kubenswrapper[2565]: W0423 01:10:13.358248 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358251 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 01:10:13.358801 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358253 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358256 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358258 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358261 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358263 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358265 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358269 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358273 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358276 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358278 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358281 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358283 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358286 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358289 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358291 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358293 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358296 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358299 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358301 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 01:10:13.359281 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358303 2565 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358306 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358309 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358311 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358313 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358316 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358318 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358321 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358323 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358326 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358328 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358330 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358333 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 01:10:13.359726 ip-10-0-135-74 
kubenswrapper[2565]: W0423 01:10:13.358335 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358337 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358340 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358343 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358345 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358348 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 01:10:13.359726 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358352 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.358358 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358447 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358452 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358455 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358458 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358462 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358465 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358468 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358470 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358473 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358475 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358478 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358480 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358482 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358485 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 01:10:13.360177 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358487 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358489 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358492 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 
01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358495 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358497 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358499 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358502 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358504 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358506 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358509 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358511 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358513 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358516 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358518 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358521 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358523 2565 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358525 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358528 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358530 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358533 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 01:10:13.360558 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358535 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358537 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358540 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358542 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358544 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358547 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358550 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358552 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 
01:10:13.358554 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358557 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358560 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358563 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358565 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358567 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358570 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358572 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358575 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358577 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358579 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 01:10:13.361039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358582 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358584 2565 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358587 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358589 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358591 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358594 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358596 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358599 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358601 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358603 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358606 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358608 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358610 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358613 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358615 2565 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358618 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358622 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358625 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358628 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358631 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 01:10:13.361484 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358633 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358636 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358639 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358642 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358645 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358647 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358650 2565 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358652 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358655 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358658 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358660 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358662 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:13.358665 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.358669 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.359309 2565 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 01:10:13.362043 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.361567 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 01:10:13.362416 ip-10-0-135-74 kubenswrapper[2565]: 
I0423 01:10:13.362373 2565 server.go:1019] "Starting client certificate rotation" Apr 23 01:10:13.362493 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.362475 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 01:10:13.362543 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.362519 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 01:10:13.383730 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.383713 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 01:10:13.386263 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.386244 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 01:10:13.398825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.398804 2565 log.go:25] "Validated CRI v1 runtime API" Apr 23 01:10:13.404298 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.404281 2565 log.go:25] "Validated CRI v1 image API" Apr 23 01:10:13.406446 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.406429 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 01:10:13.410422 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.410399 2565 fs.go:135] Filesystem UUIDs: map[1d302c2e-9f47-4378-a6f2-59d2314b3554:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8dc70702-f61e-41b2-9f12-632561b486f6:/dev/nvme0n1p3] Apr 23 01:10:13.410475 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.410422 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} 
/tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 01:10:13.411693 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.411679 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 01:10:13.416124 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.416002 2565 manager.go:217] Machine: {Timestamp:2026-04-23 01:10:13.413912482 +0000 UTC m=+0.373765932 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200034 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27f9d151b3ea15783281ceea101263 SystemUUID:ec27f9d1-51b3-ea15-7832-81ceea101263 BootID:69854d7e-191b-4a88-aaf1-9328720f5776 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:36:49:8a:dc:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:36:49:8a:dc:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:67:ff:b9:55:4a Speed:0 Mtu:1500}] Topology:[{Id:0 
Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 01:10:13.416124 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.416116 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 01:10:13.416249 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.416183 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 01:10:13.417142 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.417117 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 01:10:13.417276 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.417143 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-74.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 01:10:13.417357 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.417285 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 01:10:13.417357 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.417294 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 01:10:13.417357 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.417306 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 01:10:13.417357 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.417319 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 01:10:13.418470 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.418459 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 01:10:13.418696 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.418687 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 01:10:13.421024 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.421014 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 01:10:13.421073 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.421028 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 01:10:13.421073 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.421039 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 01:10:13.421073 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.421050 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 23 01:10:13.421073 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.421060 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 01:10:13.421969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.421957 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 01:10:13.422027 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.421988 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 01:10:13.425296 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.425275 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 01:10:13.426442 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.426428 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 01:10:13.428118 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428107 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428125 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428130 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428136 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428142 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428148 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428154 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428159 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428167 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 01:10:13.428169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428173 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 01:10:13.428392 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428182 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 01:10:13.428392 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428191 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 01:10:13.428886 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428877 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 01:10:13.428915 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.428887 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 01:10:13.431935 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.431901 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 01:10:13.431935 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.431925 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-74.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 01:10:13.432098 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.431936 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-74.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 01:10:13.432098 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.431995 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jrpq9"
Apr 23 01:10:13.432453 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.432436 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 01:10:13.432503 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.432477 2565 server.go:1295] "Started kubelet"
Apr 23 01:10:13.432654 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.432602 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 01:10:13.432714 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.432686 2565 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 01:10:13.432930 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.432909 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 01:10:13.433161 ip-10-0-135-74 systemd[1]: Started Kubernetes Kubelet.
Apr 23 01:10:13.433661 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.433544 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 01:10:13.434791 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.434773 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jrpq9"
Apr 23 01:10:13.437243 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.437223 2565 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 01:10:13.440383 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.440366 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 01:10:13.440463 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.440386 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 01:10:13.441070 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.441053 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 01:10:13.441070 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.441055 2565 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 01:10:13.441184 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.441080 2565 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 01:10:13.441232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.441211 2565 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 01:10:13.441232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.441222 2565 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 01:10:13.441427 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.441366 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:13.443185 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443166 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:13.443480 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443462 2565 factory.go:153] Registering CRI-O factory
Apr 23 01:10:13.443582 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443512 2565 factory.go:223] Registration of the crio container factory successfully
Apr 23 01:10:13.443582 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443564 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 01:10:13.443582 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443574 2565 factory.go:55] Registering systemd factory
Apr 23 01:10:13.443582 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443581 2565 factory.go:223] Registration of the systemd container factory successfully
Apr 23 01:10:13.443735 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443604 2565 factory.go:103] Registering Raw factory
Apr 23 01:10:13.443735 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.443616 2565 manager.go:1196] Started watching for new ooms in manager
Apr 23 01:10:13.443735 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.443672 2565 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 01:10:13.444036 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.444024 2565 manager.go:319] Starting recovery of all containers
Apr 23 01:10:13.445466 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.445445 2565 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-74.ec2.internal\" not found" node="ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.452401 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.452275 2565 manager.go:324] Recovery completed
Apr 23 01:10:13.456714 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.456702 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:13.458960 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.458947 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:13.459043 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.458971 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:13.459043 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.458999 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:13.459479 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.459463 2565 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 01:10:13.459529 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.459477 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 01:10:13.459529 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.459496 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 01:10:13.461478 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.461467 2565 policy_none.go:49] "None policy: Start"
Apr 23 01:10:13.461522 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.461482 2565 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 01:10:13.461522 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.461492 2565 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.499162 2565 manager.go:341] "Starting Device Plugin manager"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.499192 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.499204 2565 server.go:85] "Starting device plugin registration server"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.499421 2565 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.499435 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.499573 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.499641 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.499649 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.500966 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 01:10:13.514918 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.501073 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:13.553834 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.553803 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 01:10:13.554918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.554896 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 01:10:13.555008 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.554926 2565 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 01:10:13.555008 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.554945 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 01:10:13.555008 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.554958 2565 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 01:10:13.555163 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.555008 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 01:10:13.557691 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.557677 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:13.600384 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.600343 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:13.601459 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.601444 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:13.601522 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.601470 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:13.601522 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.601480 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:13.601522 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.601499 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.609541 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.609529 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.609582 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.609546 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-74.ec2.internal\": node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:13.622271 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.622253 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:13.655365 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.655337 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"]
Apr 23 01:10:13.655411 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.655396 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:13.656602 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.656579 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:13.656602 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.656604 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:13.656713 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.656614 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:13.658808 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.658797 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:13.658928 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.658917 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.658962 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.658940 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:13.659461 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.659442 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:13.659519 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.659468 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:13.659519 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.659447 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:13.659519 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.659507 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:13.659616 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.659480 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:13.659616 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.659523 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:13.661809 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.661794 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.661902 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.661819 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:10:13.662411 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.662398 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:10:13.662500 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.662424 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:10:13.662500 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.662438 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:10:13.681461 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.681439 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-74.ec2.internal\" not found" node="ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.685405 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.685388 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-74.ec2.internal\" not found" node="ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.723116 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.723095 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:13.743166 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.743147 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/62aaf4ba8ebf9ff33dbeeea8017ddd63-config\") pod \"kube-apiserver-proxy-ip-10-0-135-74.ec2.internal\" (UID: \"62aaf4ba8ebf9ff33dbeeea8017ddd63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.743239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.743176 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9cc53ca7c179912fa71960ac8d9c32e8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal\" (UID: \"9cc53ca7c179912fa71960ac8d9c32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.743239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.743193 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cc53ca7c179912fa71960ac8d9c32e8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal\" (UID: \"9cc53ca7c179912fa71960ac8d9c32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.823959 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.823939 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:13.844295 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.844274 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/62aaf4ba8ebf9ff33dbeeea8017ddd63-config\") pod \"kube-apiserver-proxy-ip-10-0-135-74.ec2.internal\" (UID: \"62aaf4ba8ebf9ff33dbeeea8017ddd63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.844295 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.844282 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/62aaf4ba8ebf9ff33dbeeea8017ddd63-config\") pod \"kube-apiserver-proxy-ip-10-0-135-74.ec2.internal\" (UID: \"62aaf4ba8ebf9ff33dbeeea8017ddd63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.844423 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.844305 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9cc53ca7c179912fa71960ac8d9c32e8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal\" (UID: \"9cc53ca7c179912fa71960ac8d9c32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.844423 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.844325 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cc53ca7c179912fa71960ac8d9c32e8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal\" (UID: \"9cc53ca7c179912fa71960ac8d9c32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.844423 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.844359 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cc53ca7c179912fa71960ac8d9c32e8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal\" (UID: \"9cc53ca7c179912fa71960ac8d9c32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.844423 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.844365 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9cc53ca7c179912fa71960ac8d9c32e8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal\" (UID: \"9cc53ca7c179912fa71960ac8d9c32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.924615 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:13.924576 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:13.983110 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.983088 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:13.988274 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:13.988261 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:14.025565 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:14.025544 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:14.126074 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:14.126059 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:14.226551 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:14.226507 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:14.327111 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:14.327095 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:14.343737 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.343719 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:14.362580 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.362561 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 01:10:14.362684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.362666 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 01:10:14.362721 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.362671 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 01:10:14.362721 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.362672 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 01:10:14.428038 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:14.428012 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:14.436991 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.436950 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 01:05:13 +0000 UTC" deadline="2028-01-17 12:47:39.36975588 +0000 UTC"
Apr 23 01:10:14.436991 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.436988 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15227h37m24.932783595s"
Apr 23 01:10:14.440807 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.440784 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 01:10:14.453750 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.453726 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 01:10:14.476488 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.476465 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4fbk2"
Apr 23 01:10:14.486148 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.486131 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4fbk2"
Apr 23 01:10:14.489858 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:14.489835 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62aaf4ba8ebf9ff33dbeeea8017ddd63.slice/crio-61e1be7978bb054757f7b32df9a5d660ac88b8695323e619cfa8140c20ed474f WatchSource:0}: Error finding container 61e1be7978bb054757f7b32df9a5d660ac88b8695323e619cfa8140c20ed474f: Status 404 returned error can't find the container with id 61e1be7978bb054757f7b32df9a5d660ac88b8695323e619cfa8140c20ed474f
Apr 23 01:10:14.490103 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:14.490084 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc53ca7c179912fa71960ac8d9c32e8.slice/crio-b95384cb2a37d2212248d10e4bb545af877f7635d6a0d0307aa2994f4108c33b WatchSource:0}: Error finding container b95384cb2a37d2212248d10e4bb545af877f7635d6a0d0307aa2994f4108c33b: Status 404 returned error can't find the container with id b95384cb2a37d2212248d10e4bb545af877f7635d6a0d0307aa2994f4108c33b
Apr 23 01:10:14.493597 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.493584 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:10:14.528265 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:14.528245 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:14.557202 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.557162 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal" event={"ID":"62aaf4ba8ebf9ff33dbeeea8017ddd63","Type":"ContainerStarted","Data":"61e1be7978bb054757f7b32df9a5d660ac88b8695323e619cfa8140c20ed474f"}
Apr 23 01:10:14.558024 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.558003 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal" event={"ID":"9cc53ca7c179912fa71960ac8d9c32e8","Type":"ContainerStarted","Data":"b95384cb2a37d2212248d10e4bb545af877f7635d6a0d0307aa2994f4108c33b"}
Apr 23 01:10:14.629190 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:14.629170 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-74.ec2.internal\" not found"
Apr 23 01:10:14.648819 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.648802 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:10:14.740553 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.740506 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal"
Apr 23 01:10:14.749799 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.749782 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising
behavior; a DNS label is recommended: [must not contain dots]" Apr 23 01:10:14.750504 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.750493 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal" Apr 23 01:10:14.761024 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:14.761006 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 01:10:15.276803 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.276771 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 01:10:15.422261 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.422233 2565 apiserver.go:52] "Watching apiserver" Apr 23 01:10:15.430538 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.430515 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 01:10:15.433858 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.432221 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr","openshift-dns/node-resolver-twhzp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal","openshift-multus/multus-additional-cni-plugins-g4vqs","openshift-multus/network-metrics-daemon-ps42z","kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal","openshift-cluster-node-tuning-operator/tuned-6xmb8","openshift-image-registry/node-ca-rbjs6","openshift-multus/multus-rgvmw","openshift-network-diagnostics/network-check-target-vvtxz","openshift-network-operator/iptables-alerter-k54d2","openshift-ovn-kubernetes/ovnkube-node-xt5wx","kube-system/konnectivity-agent-xg5xx"] Apr 23 01:10:15.435015 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.434999 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.437073 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.437051 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.437728 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.437596 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 01:10:15.437728 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.437606 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 01:10:15.437728 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.437652 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2v2bq\"" Apr 23 01:10:15.439328 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.439307 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.439436 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.439419 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bmzbz\"" Apr 23 01:10:15.439666 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.439646 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 01:10:15.439818 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.439748 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 01:10:15.441490 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.441473 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 01:10:15.441587 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.441570 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 01:10:15.441801 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.441758 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 01:10:15.441912 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.441892 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 01:10:15.442003 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.441946 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 01:10:15.442003 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.441949 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ftltm\"" Apr 23 
01:10:15.443813 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.443798 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:15.443898 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.443872 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:15.446092 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.446065 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.448245 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.448226 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rbjs6" Apr 23 01:10:15.448623 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.448522 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 01:10:15.449239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.448808 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j4s9c\"" Apr 23 01:10:15.449239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.448888 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 01:10:15.449239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.448949 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 01:10:15.450463 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.450443 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.450546 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.450488 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:15.450610 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.450556 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:15.450831 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.450769 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 01:10:15.450947 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.450929 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 01:10:15.451015 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.450934 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wfw99\"" Apr 23 01:10:15.451347 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.451132 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 01:10:15.452226 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452203 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg2dg\" (UniqueName: \"kubernetes.io/projected/100ffad2-0adc-4293-8bc1-c64fdc753f08-kube-api-access-sg2dg\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.452318 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452239 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj76j\" (UniqueName: \"kubernetes.io/projected/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-kube-api-access-zj76j\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:15.452318 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452264 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysctl-conf\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452318 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452288 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-os-release\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452318 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452311 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452350 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452374 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/100ffad2-0adc-4293-8bc1-c64fdc753f08-hosts-file\") pod \"node-resolver-twhzp\" (UID: 
\"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.452513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452398 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlw7\" (UniqueName: \"kubernetes.io/projected/df6051db-2c86-4881-a85a-a58d1ac659bd-kube-api-access-ghlw7\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452437 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-system-cni-dir\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452473 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrjq\" (UniqueName: \"kubernetes.io/projected/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-kube-api-access-wgrjq\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452493 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cnibin\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452508 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452524 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-host\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452567 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df6051db-2c86-4881-a85a-a58d1ac659bd-tmp\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452596 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-kubernetes\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452611 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysctl-d\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:10:15.452627 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-systemd\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452655 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-run\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452676 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-lib-modules\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452700 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452724 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-var-lib-kubelet\") pod \"tuned-6xmb8\" (UID: 
\"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452747 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452783 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-modprobe-d\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452807 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysconfig\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.452832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452829 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-sys\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.453417 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452855 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-tuned\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.453417 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.452871 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/100ffad2-0adc-4293-8bc1-c64fdc753f08-tmp-dir\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.453417 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.453163 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 01:10:15.453417 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.453210 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vk6fd\"" Apr 23 01:10:15.453592 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.453568 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.455828 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.455807 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 01:10:15.455936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.455892 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 01:10:15.456085 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.456068 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.456179 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.456155 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vssrh\"" Apr 23 01:10:15.456671 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.456654 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 01:10:15.458222 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.458199 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:15.458717 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.458702 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 01:10:15.460632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.460423 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 01:10:15.460712 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.460665 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 01:10:15.460931 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.460885 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 01:10:15.461074 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.461061 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qtv6m\"" Apr 23 01:10:15.461620 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.461196 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-kl525\"" Apr 23 01:10:15.461709 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.461641 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 01:10:15.461769 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.461754 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 01:10:15.463207 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.461843 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 01:10:15.463207 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.461998 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 01:10:15.486857 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.486837 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 01:05:14 +0000 UTC" deadline="2027-11-25 01:38:34.824923952 +0000 UTC" Apr 23 01:10:15.486857 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.486856 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13944h28m19.338070084s" Apr 23 01:10:15.541936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.541888 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 01:10:15.553297 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553268 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " 
pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.553403 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553308 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2bq\" (UniqueName: \"kubernetes.io/projected/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-kube-api-access-wz2bq\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6" Apr 23 01:10:15.553403 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553334 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-netns\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.553403 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553382 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-cni-bin\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.553537 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553417 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndtbq\" (UniqueName: \"kubernetes.io/projected/09a44761-f9fa-463b-86c4-dacaca4d17a8-kube-api-access-ndtbq\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.553537 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553463 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.553537 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553492 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-systemd\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.553537 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-node-log\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553541 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-cni-netd\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553581 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-ovnkube-config\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553606 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-sys\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553630 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-tuned\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553656 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg2dg\" (UniqueName: \"kubernetes.io/projected/100ffad2-0adc-4293-8bc1-c64fdc753f08-kube-api-access-sg2dg\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553681 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj76j\" (UniqueName: \"kubernetes.io/projected/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-kube-api-access-zj76j\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553687 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-sys\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553707 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-ovn\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553732 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysctl-conf\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.553756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553756 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553785 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553809 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553825 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553836 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/100ffad2-0adc-4293-8bc1-c64fdc753f08-hosts-file\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553861 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-serviceca\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553885 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553910 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-run-ovn-kubernetes\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553922 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/100ffad2-0adc-4293-8bc1-c64fdc753f08-hosts-file\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553927 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.553957 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrjq\" (UniqueName: \"kubernetes.io/projected/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-kube-api-access-wgrjq\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554013 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-socket-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554048 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/37eabc47-849b-40f9-bac1-f73b5c3d4329-agent-certs\") pod \"konnectivity-agent-xg5xx\" (UID: \"37eabc47-849b-40f9-bac1-f73b5c3d4329\") " pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554073 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-cnibin\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-socket-dir-parent\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554120 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-conf-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554143 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-hostroot\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.554256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554168 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-run-netns\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554190 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-cni-bin\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554203 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysctl-conf\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554213 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbn7l\" (UniqueName: \"kubernetes.io/projected/6a5144da-faff-4d89-b9c9-baf899b2a716-kube-api-access-mbn7l\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554240 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df6051db-2c86-4881-a85a-a58d1ac659bd-tmp\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554266 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-etc-selinux\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554374 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-host\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554407 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554414 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-cni-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554463 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-kubelet\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.554969 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554484 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-log-socket\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-kubernetes\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-systemd\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554561 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554580 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxl6\" (UniqueName: \"kubernetes.io/projected/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-kube-api-access-wnxl6\") pod \"iptables-alerter-k54d2\" (UID: 
\"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554599 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-systemd\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554588 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-kubernetes\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.554969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554620 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-system-cni-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554647 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a5144da-faff-4d89-b9c9-baf899b2a716-ovn-node-metrics-cert\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554676 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-var-lib-kubelet\") pod 
\"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554738 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-var-lib-kubelet\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554774 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-registration-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554799 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-sys-fs\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554825 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-multus-certs\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554852 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-slash\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554878 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-ovnkube-script-lib\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554904 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-modprobe-d\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.554964 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysconfig\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555004 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/100ffad2-0adc-4293-8bc1-c64fdc753f08-tmp-dir\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555030 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-env-overrides\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-os-release\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555079 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-modprobe-d\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555120 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysconfig\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555081 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.555825 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:10:15.555171 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-os-release\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555182 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-etc-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555212 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-device-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555238 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555236 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-cni-multus\") pod \"multus-rgvmw\" (UID: 
\"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555284 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghlw7\" (UniqueName: \"kubernetes.io/projected/df6051db-2c86-4881-a85a-a58d1ac659bd-kube-api-access-ghlw7\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555310 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-system-cni-dir\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555333 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/100ffad2-0adc-4293-8bc1-c64fdc753f08-tmp-dir\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555344 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-system-cni-dir\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555390 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-iptables-alerter-script\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555416 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-os-release\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555440 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-var-lib-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cnibin\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555493 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555516 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-host-slash\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2"
Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555517 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-cnibin\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs"
Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555573 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-kubelet\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.556632 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555600 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-systemd-units\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555633 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555658 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-k8s-cni-cncf-io\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.555676 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555679 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a44761-f9fa-463b-86c4-dacaca4d17a8-cni-binary-copy\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.555772 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:10:16.055730751 +0000 UTC m=+3.015584215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555803 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-host\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555831 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln49g\" (UniqueName: \"kubernetes.io/projected/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-kube-api-access-ln49g\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555859 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/37eabc47-849b-40f9-bac1-f73b5c3d4329-konnectivity-ca\") pod \"konnectivity-agent-xg5xx\" (UID: \"37eabc47-849b-40f9-bac1-f73b5c3d4329\") " pod="kube-system/konnectivity-agent-xg5xx"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555893 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-daemon-config\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555891 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-host\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555924 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-etc-kubernetes\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555960 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysctl-d\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.555996 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-run\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.556014 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-lib-modules\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.556067 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-run\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.556072 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-sysctl-d\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.557444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.556149 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df6051db-2c86-4881-a85a-a58d1ac659bd-lib-modules\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.558137 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.557320 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/df6051db-2c86-4881-a85a-a58d1ac659bd-etc-tuned\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.558137 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.557781 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df6051db-2c86-4881-a85a-a58d1ac659bd-tmp\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.567199 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.567162 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg2dg\" (UniqueName: \"kubernetes.io/projected/100ffad2-0adc-4293-8bc1-c64fdc753f08-kube-api-access-sg2dg\") pod \"node-resolver-twhzp\" (UID: \"100ffad2-0adc-4293-8bc1-c64fdc753f08\") " pod="openshift-dns/node-resolver-twhzp"
Apr 23 01:10:15.567330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.567303 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrjq\" (UniqueName: \"kubernetes.io/projected/54d9175d-7498-4b1f-8e42-2c7b5a37d2f4-kube-api-access-wgrjq\") pod \"multus-additional-cni-plugins-g4vqs\" (UID: \"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4\") " pod="openshift-multus/multus-additional-cni-plugins-g4vqs"
Apr 23 01:10:15.567601 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.567569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlw7\" (UniqueName: \"kubernetes.io/projected/df6051db-2c86-4881-a85a-a58d1ac659bd-kube-api-access-ghlw7\") pod \"tuned-6xmb8\" (UID: \"df6051db-2c86-4881-a85a-a58d1ac659bd\") " pod="openshift-cluster-node-tuning-operator/tuned-6xmb8"
Apr 23 01:10:15.567851 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.567833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj76j\" (UniqueName: \"kubernetes.io/projected/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-kube-api-access-zj76j\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z"
Apr 23 01:10:15.656422 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656360 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-k8s-cni-cncf-io\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656451 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a44761-f9fa-463b-86c4-dacaca4d17a8-cni-binary-copy\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln49g\" (UniqueName: \"kubernetes.io/projected/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-kube-api-access-ln49g\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.656573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656503 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/37eabc47-849b-40f9-bac1-f73b5c3d4329-konnectivity-ca\") pod \"konnectivity-agent-xg5xx\" (UID: \"37eabc47-849b-40f9-bac1-f73b5c3d4329\") " pod="kube-system/konnectivity-agent-xg5xx"
Apr 23 01:10:15.656573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656511 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-k8s-cni-cncf-io\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656474 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656528 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-daemon-config\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656588 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-etc-kubernetes\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656640 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-etc-kubernetes\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656666 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2bq\" (UniqueName: \"kubernetes.io/projected/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-kube-api-access-wz2bq\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-netns\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656713 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-cni-bin\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656737 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndtbq\" (UniqueName: \"kubernetes.io/projected/09a44761-f9fa-463b-86c4-dacaca4d17a8-kube-api-access-ndtbq\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656760 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-systemd\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656783 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-node-log\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656794 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-cni-bin\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656814 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-systemd\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656806 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-cni-netd\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656835 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-cni-netd\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656782 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-netns\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656852 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-ovnkube-config\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656857 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-node-log\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.656901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656882 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-ovn\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656933 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656958 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656963 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-ovn\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.656998 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-serviceca\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657016 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-run-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657019 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657025 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657066 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-run-ovn-kubernetes\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657088 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/37eabc47-849b-40f9-bac1-f73b5c3d4329-konnectivity-ca\") pod \"konnectivity-agent-xg5xx\" (UID: \"37eabc47-849b-40f9-bac1-f73b5c3d4329\") " pod="kube-system/konnectivity-agent-xg5xx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657094 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-socket-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657144 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/37eabc47-849b-40f9-bac1-f73b5c3d4329-agent-certs\") pod \"konnectivity-agent-xg5xx\" (UID: \"37eabc47-849b-40f9-bac1-f73b5c3d4329\") " pod="kube-system/konnectivity-agent-xg5xx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657169 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-cnibin\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657192 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-socket-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657189 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-run-ovn-kubernetes\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657215 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-socket-dir-parent\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657256 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-conf-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657249 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-cnibin\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.657639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657278 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-hostroot\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657303 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-run-netns\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657314 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-socket-dir-parent\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657322 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-conf-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657326 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-cni-bin\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657354 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-hostroot\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657360 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-cni-bin\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657366 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbn7l\" (UniqueName: \"kubernetes.io/projected/6a5144da-faff-4d89-b9c9-baf899b2a716-kube-api-access-mbn7l\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657391 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-etc-selinux\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657397 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-run-netns\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657414 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-host\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657429 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-ovnkube-config\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657437 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-cni-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657472 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-serviceca\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657485 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-host\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657498 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-cni-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657513 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-kubelet\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657530 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-etc-selinux\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.658438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657539 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-log-socket\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657567 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxl6\" (UniqueName: \"kubernetes.io/projected/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-kube-api-access-wnxl6\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657573 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-kubelet\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657606 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-log-socket\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657667 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-system-cni-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657695 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a5144da-faff-4d89-b9c9-baf899b2a716-ovn-node-metrics-cert\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657722 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-registration-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657729 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-system-cni-dir\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657745 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-sys-fs\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657786 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-multus-certs\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657797 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-registration-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657805 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-sys-fs\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657814 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-slash\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657840 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-run-multus-certs\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657850 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-ovnkube-script-lib\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657870 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-host-slash\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657898 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-env-overrides\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:10:15.659239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.657999 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-etc-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-device-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658054 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-cni-multus\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658080 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-iptables-alerter-script\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658107 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-os-release\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658121 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-device-dir\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658130 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-var-lib-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658170 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-host-slash\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658179 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-etc-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658193 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-kubelet\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658219 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-systemd-units\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-os-release\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658283 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-var-lib-openvswitch\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658290 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-host-slash\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658317 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a5144da-faff-4d89-b9c9-baf899b2a716-systemd-units\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658327 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-cni-multus\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-ovnkube-script-lib\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658364 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a44761-f9fa-463b-86c4-dacaca4d17a8-host-var-lib-kubelet\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.659961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658401 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09a44761-f9fa-463b-86c4-dacaca4d17a8-multus-daemon-config\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.660780 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658733 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a5144da-faff-4d89-b9c9-baf899b2a716-env-overrides\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.660780 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658820 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-iptables-alerter-script\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.660780 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.658967 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09a44761-f9fa-463b-86c4-dacaca4d17a8-cni-binary-copy\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.660780 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.660162 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/37eabc47-849b-40f9-bac1-f73b5c3d4329-agent-certs\") pod \"konnectivity-agent-xg5xx\" (UID: \"37eabc47-849b-40f9-bac1-f73b5c3d4329\") " pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:15.660780 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.660697 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a5144da-faff-4d89-b9c9-baf899b2a716-ovn-node-metrics-cert\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.662257 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.662240 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:15.662361 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.662261 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:15.662361 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.662275 2565 
projected.go:194] Error preparing data for projected volume kube-api-access-kxpct for pod openshift-network-diagnostics/network-check-target-vvtxz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:15.662361 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:15.662346 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct podName:2b220af6-5884-49f2-943b-c16a83a47800 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:16.162330103 +0000 UTC m=+3.122183544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpct" (UniqueName: "kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct") pod "network-check-target-vvtxz" (UID: "2b220af6-5884-49f2-943b-c16a83a47800") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:15.665087 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.665063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbn7l\" (UniqueName: \"kubernetes.io/projected/6a5144da-faff-4d89-b9c9-baf899b2a716-kube-api-access-mbn7l\") pod \"ovnkube-node-xt5wx\" (UID: \"6a5144da-faff-4d89-b9c9-baf899b2a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.665197 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.665167 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxl6\" (UniqueName: \"kubernetes.io/projected/e4fb88ad-bf54-408b-b0bc-28ff6f866ea2-kube-api-access-wnxl6\") pod \"iptables-alerter-k54d2\" (UID: \"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2\") " pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.665602 ip-10-0-135-74 kubenswrapper[2565]: 
I0423 01:10:15.665574 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndtbq\" (UniqueName: \"kubernetes.io/projected/09a44761-f9fa-463b-86c4-dacaca4d17a8-kube-api-access-ndtbq\") pod \"multus-rgvmw\" (UID: \"09a44761-f9fa-463b-86c4-dacaca4d17a8\") " pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.666000 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.665954 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln49g\" (UniqueName: \"kubernetes.io/projected/8e55bc37-b8b1-4a41-8534-2acf9f5535eb-kube-api-access-ln49g\") pod \"aws-ebs-csi-driver-node-zwfwr\" (UID: \"8e55bc37-b8b1-4a41-8534-2acf9f5535eb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.666476 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.666447 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2bq\" (UniqueName: \"kubernetes.io/projected/318c5767-f4ad-4937-bccb-ef0c86ed7ff7-kube-api-access-wz2bq\") pod \"node-ca-rbjs6\" (UID: \"318c5767-f4ad-4937-bccb-ef0c86ed7ff7\") " pod="openshift-image-registry/node-ca-rbjs6" Apr 23 01:10:15.745299 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.745275 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" Apr 23 01:10:15.752816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.752799 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-twhzp" Apr 23 01:10:15.763004 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.762967 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" Apr 23 01:10:15.765299 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.765273 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf6051db_2c86_4881_a85a_a58d1ac659bd.slice/crio-ecdc13d2510fb70c4fc62acb71617d43a6c7606af5f63772f5731c9706d38f26 WatchSource:0}: Error finding container ecdc13d2510fb70c4fc62acb71617d43a6c7606af5f63772f5731c9706d38f26: Status 404 returned error can't find the container with id ecdc13d2510fb70c4fc62acb71617d43a6c7606af5f63772f5731c9706d38f26 Apr 23 01:10:15.766427 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.766402 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod100ffad2_0adc_4293_8bc1_c64fdc753f08.slice/crio-4c69e2157266966752072bd640a9384770ffe097d35480f6d86ad60734a64ade WatchSource:0}: Error finding container 4c69e2157266966752072bd640a9384770ffe097d35480f6d86ad60734a64ade: Status 404 returned error can't find the container with id 4c69e2157266966752072bd640a9384770ffe097d35480f6d86ad60734a64ade Apr 23 01:10:15.767961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.767933 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" Apr 23 01:10:15.773315 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.773282 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d9175d_7498_4b1f_8e42_2c7b5a37d2f4.slice/crio-72caf24e0a299755cff463e115a9fa4fe74a342fd5b4176000fe34f650a99bbd WatchSource:0}: Error finding container 72caf24e0a299755cff463e115a9fa4fe74a342fd5b4176000fe34f650a99bbd: Status 404 returned error can't find the container with id 72caf24e0a299755cff463e115a9fa4fe74a342fd5b4176000fe34f650a99bbd Apr 23 01:10:15.774663 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.774644 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rbjs6" Apr 23 01:10:15.775990 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.775950 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e55bc37_b8b1_4a41_8534_2acf9f5535eb.slice/crio-358baecb099d2cdea4585cf36f5d8a7e521dd5da4398d5e56e438fb78911302b WatchSource:0}: Error finding container 358baecb099d2cdea4585cf36f5d8a7e521dd5da4398d5e56e438fb78911302b: Status 404 returned error can't find the container with id 358baecb099d2cdea4585cf36f5d8a7e521dd5da4398d5e56e438fb78911302b Apr 23 01:10:15.780483 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.780464 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rgvmw" Apr 23 01:10:15.781068 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.781048 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod318c5767_f4ad_4937_bccb_ef0c86ed7ff7.slice/crio-553d7189e102ae55bf1a329f8758d476ea06b38e49350b31010f8408a82cdc22 WatchSource:0}: Error finding container 553d7189e102ae55bf1a329f8758d476ea06b38e49350b31010f8408a82cdc22: Status 404 returned error can't find the container with id 553d7189e102ae55bf1a329f8758d476ea06b38e49350b31010f8408a82cdc22 Apr 23 01:10:15.788701 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.788682 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k54d2" Apr 23 01:10:15.789993 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.789958 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a44761_f9fa_463b_86c4_dacaca4d17a8.slice/crio-072d94753b388d9d46cb471bc98693e69cb6923ffa110b7c81619f1d8b6cd5ac WatchSource:0}: Error finding container 072d94753b388d9d46cb471bc98693e69cb6923ffa110b7c81619f1d8b6cd5ac: Status 404 returned error can't find the container with id 072d94753b388d9d46cb471bc98693e69cb6923ffa110b7c81619f1d8b6cd5ac Apr 23 01:10:15.794460 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.794441 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:15.797214 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.797188 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fb88ad_bf54_408b_b0bc_28ff6f866ea2.slice/crio-5c1d29665d934d201fd800f4127a44aa3debba29661d6c194fb27d627c00d432 WatchSource:0}: Error finding container 5c1d29665d934d201fd800f4127a44aa3debba29661d6c194fb27d627c00d432: Status 404 returned error can't find the container with id 5c1d29665d934d201fd800f4127a44aa3debba29661d6c194fb27d627c00d432 Apr 23 01:10:15.798924 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:15.798762 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:15.803392 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.803225 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5144da_faff_4d89_b9c9_baf899b2a716.slice/crio-c4086c151b418a3bce43eb8fb8bf13958bce7d2586495fe3082173c749133aa2 WatchSource:0}: Error finding container c4086c151b418a3bce43eb8fb8bf13958bce7d2586495fe3082173c749133aa2: Status 404 returned error can't find the container with id c4086c151b418a3bce43eb8fb8bf13958bce7d2586495fe3082173c749133aa2 Apr 23 01:10:15.812761 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:15.812736 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37eabc47_849b_40f9_bac1_f73b5c3d4329.slice/crio-f2fbaf6fb71e3c3725d81d192fba3983903aed958da2ca7bf78fb20cd3247657 WatchSource:0}: Error finding container f2fbaf6fb71e3c3725d81d192fba3983903aed958da2ca7bf78fb20cd3247657: Status 404 returned error can't find the container with id f2fbaf6fb71e3c3725d81d192fba3983903aed958da2ca7bf78fb20cd3247657 Apr 23 01:10:15.862113 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:10:15.862095 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 01:10:16.061587 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.061513 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:16.061689 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:16.061602 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:16.061689 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:16.061645 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:10:17.061631708 +0000 UTC m=+4.021485145 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:16.232306 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.232278 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 01:10:16.262786 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.262755 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:16.262927 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:16.262907 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:16.263018 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:16.262934 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:16.263018 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:16.262951 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kxpct for pod openshift-network-diagnostics/network-check-target-vvtxz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:16.263018 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:16.263016 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct podName:2b220af6-5884-49f2-943b-c16a83a47800 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:17.262998125 +0000 UTC m=+4.222851579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpct" (UniqueName: "kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct") pod "network-check-target-vvtxz" (UID: "2b220af6-5884-49f2-943b-c16a83a47800") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:16.487065 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.486968 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 01:05:14 +0000 UTC" deadline="2027-12-20 11:44:22.55769163 +0000 UTC" Apr 23 01:10:16.487065 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.487019 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14554h34m6.070676293s" Apr 23 01:10:16.577502 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.577466 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"c4086c151b418a3bce43eb8fb8bf13958bce7d2586495fe3082173c749133aa2"} Apr 23 01:10:16.588836 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.588807 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k54d2" event={"ID":"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2","Type":"ContainerStarted","Data":"5c1d29665d934d201fd800f4127a44aa3debba29661d6c194fb27d627c00d432"} Apr 23 01:10:16.599673 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.599600 2565 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-rgvmw" event={"ID":"09a44761-f9fa-463b-86c4-dacaca4d17a8","Type":"ContainerStarted","Data":"072d94753b388d9d46cb471bc98693e69cb6923ffa110b7c81619f1d8b6cd5ac"} Apr 23 01:10:16.612848 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.611196 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rbjs6" event={"ID":"318c5767-f4ad-4937-bccb-ef0c86ed7ff7","Type":"ContainerStarted","Data":"553d7189e102ae55bf1a329f8758d476ea06b38e49350b31010f8408a82cdc22"} Apr 23 01:10:16.617724 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.617684 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" event={"ID":"8e55bc37-b8b1-4a41-8534-2acf9f5535eb","Type":"ContainerStarted","Data":"358baecb099d2cdea4585cf36f5d8a7e521dd5da4398d5e56e438fb78911302b"} Apr 23 01:10:16.625722 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.625698 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerStarted","Data":"72caf24e0a299755cff463e115a9fa4fe74a342fd5b4176000fe34f650a99bbd"} Apr 23 01:10:16.639152 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.639091 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-twhzp" event={"ID":"100ffad2-0adc-4293-8bc1-c64fdc753f08","Type":"ContainerStarted","Data":"4c69e2157266966752072bd640a9384770ffe097d35480f6d86ad60734a64ade"} Apr 23 01:10:16.650828 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.650319 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal" event={"ID":"62aaf4ba8ebf9ff33dbeeea8017ddd63","Type":"ContainerStarted","Data":"f07e23e3d81672fafee30657030fabb106ef644ebbfee0bd3c9d96dce505d530"} Apr 23 01:10:16.656393 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.656374 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xg5xx" event={"ID":"37eabc47-849b-40f9-bac1-f73b5c3d4329","Type":"ContainerStarted","Data":"f2fbaf6fb71e3c3725d81d192fba3983903aed958da2ca7bf78fb20cd3247657"} Apr 23 01:10:16.659565 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.659534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" event={"ID":"df6051db-2c86-4881-a85a-a58d1ac659bd","Type":"ContainerStarted","Data":"ecdc13d2510fb70c4fc62acb71617d43a6c7606af5f63772f5731c9706d38f26"} Apr 23 01:10:16.664582 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:16.664534 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-74.ec2.internal" podStartSLOduration=2.664520843 podStartE2EDuration="2.664520843s" podCreationTimestamp="2026-04-23 01:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:10:16.664230167 +0000 UTC m=+3.624083626" watchObservedRunningTime="2026-04-23 01:10:16.664520843 +0000 UTC m=+3.624374303" Apr 23 01:10:17.068683 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:17.068643 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:17.068850 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.068830 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:17.068922 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.068899 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:10:19.06888103 +0000 UTC m=+6.028734472 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:17.270250 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:17.270211 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:17.270427 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.270411 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:17.270492 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.270429 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:17.270492 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.270442 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kxpct for pod openshift-network-diagnostics/network-check-target-vvtxz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:17.270590 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.270499 2565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct podName:2b220af6-5884-49f2-943b-c16a83a47800 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:19.270480455 +0000 UTC m=+6.230333898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpct" (UniqueName: "kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct") pod "network-check-target-vvtxz" (UID: "2b220af6-5884-49f2-943b-c16a83a47800") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:17.558011 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:17.557888 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:17.558408 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.558039 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:17.558473 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:17.558430 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:17.558528 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:17.558509 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:19.087307 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:19.087252 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:19.087774 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.087388 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:19.087774 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.087446 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:10:23.087426429 +0000 UTC m=+10.047279870 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:19.289631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:19.289596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:19.289787 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.289750 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:19.289787 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.289766 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:19.289787 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.289779 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kxpct for pod openshift-network-diagnostics/network-check-target-vvtxz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:19.289948 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.289829 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct podName:2b220af6-5884-49f2-943b-c16a83a47800 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:23.289811749 +0000 UTC m=+10.249665187 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpct" (UniqueName: "kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct") pod "network-check-target-vvtxz" (UID: "2b220af6-5884-49f2-943b-c16a83a47800") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:19.556660 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:19.556147 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:19.556660 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.556291 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:19.556660 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:19.556538 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:19.556852 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:19.556668 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:21.555724 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:21.555691 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:21.556161 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:21.555826 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:21.556234 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:21.556170 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:21.556284 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:21.556260 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:23.121700 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:23.121606 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:23.122159 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.121738 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:23.122159 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.121802 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:10:31.121782953 +0000 UTC m=+18.081636442 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:23.323260 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:23.323226 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:23.323436 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.323380 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:23.323436 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.323406 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:23.323436 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.323419 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kxpct for pod openshift-network-diagnostics/network-check-target-vvtxz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:23.323596 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.323486 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct podName:2b220af6-5884-49f2-943b-c16a83a47800 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:31.323466687 +0000 UTC m=+18.283320146 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpct" (UniqueName: "kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct") pod "network-check-target-vvtxz" (UID: "2b220af6-5884-49f2-943b-c16a83a47800") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:23.556614 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:23.556529 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:23.556753 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.556647 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:23.557091 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:23.557070 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:23.557182 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:23.557156 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:25.555431 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:25.555398 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:25.555825 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:25.555398 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:25.555825 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:25.555527 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:25.555825 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:25.555619 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:27.555726 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:27.555687 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:27.556195 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:27.555733 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:27.556195 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:27.555829 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:27.556195 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:27.555954 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:29.555442 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:29.555414 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:29.555862 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:29.555425 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:29.555862 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:29.555535 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:29.555862 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:29.555625 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:31.181866 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:31.181804 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:31.182357 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.181929 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:31.182357 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.182016 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:10:47.181992943 +0000 UTC m=+34.141846392 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:31.383186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:31.383151 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:31.383374 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.383290 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:31.383374 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.383309 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:31.383374 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.383321 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kxpct for pod openshift-network-diagnostics/network-check-target-vvtxz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:31.383526 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.383380 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct podName:2b220af6-5884-49f2-943b-c16a83a47800 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:47.383363584 +0000 UTC m=+34.343217023 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpct" (UniqueName: "kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct") pod "network-check-target-vvtxz" (UID: "2b220af6-5884-49f2-943b-c16a83a47800") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:31.555332 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:31.555260 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:31.555332 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:31.555294 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:31.555537 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.555396 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:31.555600 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:31.555530 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:32.698046 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:32.697819 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"040301b7efa681afeac2bb7c0eaf673cac78e09a195c4974509dcddbad2e9312"} Apr 23 01:10:32.699558 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:32.699529 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rgvmw" event={"ID":"09a44761-f9fa-463b-86c4-dacaca4d17a8","Type":"ContainerStarted","Data":"b50c78c494a213901a9a81674a2dbc8c8f4aaa6be1fa348b031251e98fb80efd"} Apr 23 01:10:32.701345 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:32.701163 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" event={"ID":"df6051db-2c86-4881-a85a-a58d1ac659bd","Type":"ContainerStarted","Data":"e336f243ee52f5d51a01c7de46dc2d248bbec3ad003a4591a500ea651bf3fbb3"} Apr 23 01:10:32.716700 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:32.716391 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rgvmw" podStartSLOduration=2.912650797 podStartE2EDuration="19.71637657s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.793179825 +0000 UTC m=+2.753033262" lastFinishedPulling="2026-04-23 01:10:32.596905587 +0000 UTC m=+19.556759035" observedRunningTime="2026-04-23 01:10:32.716371978 +0000 UTC m=+19.676225438" watchObservedRunningTime="2026-04-23 01:10:32.71637657 +0000 UTC m=+19.676230030" Apr 23 01:10:32.733854 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:32.733805 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6xmb8" podStartSLOduration=3.021817115 
podStartE2EDuration="19.733786546s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.768127713 +0000 UTC m=+2.727981163" lastFinishedPulling="2026-04-23 01:10:32.480097155 +0000 UTC m=+19.439950594" observedRunningTime="2026-04-23 01:10:32.733547602 +0000 UTC m=+19.693401062" watchObservedRunningTime="2026-04-23 01:10:32.733786546 +0000 UTC m=+19.693640006" Apr 23 01:10:33.556079 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.556052 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:33.556185 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.556144 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:33.556235 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:33.556173 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:33.556235 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:33.556221 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:33.703428 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.703400 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rbjs6" event={"ID":"318c5767-f4ad-4937-bccb-ef0c86ed7ff7","Type":"ContainerStarted","Data":"deeba22ba7de81a0fb7768299355a081876ef46e21f42dcd18a2f84364d5b746"} Apr 23 01:10:33.704644 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.704624 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" event={"ID":"8e55bc37-b8b1-4a41-8534-2acf9f5535eb","Type":"ContainerStarted","Data":"3fcfc59d0a928bd0beb10803ea3183c2f84d07f6c926814282b3e7427556a612"} Apr 23 01:10:33.705699 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.705676 2565 generic.go:358] "Generic (PLEG): container finished" podID="54d9175d-7498-4b1f-8e42-2c7b5a37d2f4" containerID="79db59567ff68af34f6e94b8129c3f2f9e2b7d0ba4e8e833a095e06d14cce819" exitCode=0 Apr 23 01:10:33.705783 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.705751 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerDied","Data":"79db59567ff68af34f6e94b8129c3f2f9e2b7d0ba4e8e833a095e06d14cce819"} Apr 23 01:10:33.709643 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.709613 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-twhzp" event={"ID":"100ffad2-0adc-4293-8bc1-c64fdc753f08","Type":"ContainerStarted","Data":"86cae9338ba3b19d9ebd6ff6dadef7aa2dcaf85f937d0d9195b040a048b7e5d7"} Apr 23 01:10:33.710888 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.710865 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xg5xx" 
event={"ID":"37eabc47-849b-40f9-bac1-f73b5c3d4329","Type":"ContainerStarted","Data":"9679e52bdebfe4d6f22fc5bca0bbb2f74cd045383d717909a187ede6de49a291"} Apr 23 01:10:33.712146 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.712126 2565 generic.go:358] "Generic (PLEG): container finished" podID="9cc53ca7c179912fa71960ac8d9c32e8" containerID="3d9de72686916b086b8d0076da8d88cc36b05be667aea9cd9d869985b22ee495" exitCode=0 Apr 23 01:10:33.712220 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.712182 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal" event={"ID":"9cc53ca7c179912fa71960ac8d9c32e8","Type":"ContainerDied","Data":"3d9de72686916b086b8d0076da8d88cc36b05be667aea9cd9d869985b22ee495"} Apr 23 01:10:33.718255 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.718218 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rbjs6" podStartSLOduration=12.004952907 podStartE2EDuration="20.718207461s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.783338285 +0000 UTC m=+2.743191725" lastFinishedPulling="2026-04-23 01:10:24.496592833 +0000 UTC m=+11.456446279" observedRunningTime="2026-04-23 01:10:33.717784473 +0000 UTC m=+20.677637932" watchObservedRunningTime="2026-04-23 01:10:33.718207461 +0000 UTC m=+20.678060966" Apr 23 01:10:33.718513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.718487 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"ee5b69211020cecdba24a91ec0b097d435f84fc828720dfbdc5a415ea32ddcb3"} Apr 23 01:10:33.718599 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.718518 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" 
event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"d3bd44c1cd010e6c139b59f06c3123ec97306ce4e092a5a7f9ffead2bb87122d"} Apr 23 01:10:33.718599 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.718527 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"169d493b9c00f7f0bb548d88e356d49061a1f88dcdc9a47ff84f051e0faa8402"} Apr 23 01:10:33.718599 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.718539 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"383d167410d9492eb40cfba2f6c91b1643182afdfed14d5da965579ac9657772"} Apr 23 01:10:33.718599 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.718555 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"f4c8261d72cfd7ced28110a5da894a323cc7f130b254cba4673cad38f8fe9612"} Apr 23 01:10:33.769440 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.769392 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-twhzp" podStartSLOduration=4.060143525 podStartE2EDuration="20.76937868s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.768874514 +0000 UTC m=+2.728727967" lastFinishedPulling="2026-04-23 01:10:32.47810968 +0000 UTC m=+19.437963122" observedRunningTime="2026-04-23 01:10:33.76885916 +0000 UTC m=+20.728712618" watchObservedRunningTime="2026-04-23 01:10:33.76937868 +0000 UTC m=+20.729232139" Apr 23 01:10:33.782506 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:33.782447 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xg5xx" 
podStartSLOduration=4.14250154 podStartE2EDuration="20.782436999s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.814281269 +0000 UTC m=+2.774134712" lastFinishedPulling="2026-04-23 01:10:32.454216727 +0000 UTC m=+19.414070171" observedRunningTime="2026-04-23 01:10:33.78208247 +0000 UTC m=+20.741935929" watchObservedRunningTime="2026-04-23 01:10:33.782436999 +0000 UTC m=+20.742290457" Apr 23 01:10:34.033174 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.033118 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 01:10:34.450006 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.449918 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:34.450589 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.450559 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:34.510751 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.510660 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T01:10:34.033133016Z","UUID":"452cff60-dd45-4003-8fd3-1193da54dca7","Handler":null,"Name":"","Endpoint":""} Apr 23 01:10:34.513835 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.513817 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 01:10:34.513985 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.513843 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 01:10:34.722797 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:10:34.722711 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal" event={"ID":"9cc53ca7c179912fa71960ac8d9c32e8","Type":"ContainerStarted","Data":"2ef38d679367c2ded2a40d1889150d416f26f8e7c4fc00008f97fece3432a685"} Apr 23 01:10:34.724911 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.724880 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k54d2" event={"ID":"e4fb88ad-bf54-408b-b0bc-28ff6f866ea2","Type":"ContainerStarted","Data":"e4203e687838bd00ca525c0bbc10806ee1980ccb7a94cac2a9c69dba39d24d92"} Apr 23 01:10:34.727596 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.727567 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" event={"ID":"8e55bc37-b8b1-4a41-8534-2acf9f5535eb","Type":"ContainerStarted","Data":"5723fdabada3b95016d5fc7265fb59b30f7247c7e912d1f6b67e6528453c65c4"} Apr 23 01:10:34.727901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.727883 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:34.728354 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.728335 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xg5xx" Apr 23 01:10:34.736718 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.736677 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-74.ec2.internal" podStartSLOduration=20.736663473 podStartE2EDuration="20.736663473s" podCreationTimestamp="2026-04-23 01:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:10:34.735829709 +0000 UTC m=+21.695683171" watchObservedRunningTime="2026-04-23 
01:10:34.736663473 +0000 UTC m=+21.696516936" Apr 23 01:10:34.751268 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:34.749530 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-k54d2" podStartSLOduration=5.070814899 podStartE2EDuration="21.749516967s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.799305487 +0000 UTC m=+2.759158937" lastFinishedPulling="2026-04-23 01:10:32.47800755 +0000 UTC m=+19.437861005" observedRunningTime="2026-04-23 01:10:34.748713409 +0000 UTC m=+21.708566868" watchObservedRunningTime="2026-04-23 01:10:34.749516967 +0000 UTC m=+21.709370427" Apr 23 01:10:35.558170 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:35.558145 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:35.558368 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:35.558145 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:35.558368 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:35.558245 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:35.558368 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:35.558300 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:35.731134 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:35.731100 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" event={"ID":"8e55bc37-b8b1-4a41-8534-2acf9f5535eb","Type":"ContainerStarted","Data":"9690dfdbbb0c51bd81a699e88ade87db41174c5cd274eb30dc8ba2c61be37901"} Apr 23 01:10:35.734688 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:35.734607 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"ffa5347a98b9abf527117ead978e38001625690713cc28f2a4e6d49967906a3e"} Apr 23 01:10:35.748800 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:35.748755 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zwfwr" podStartSLOduration=3.398202694 podStartE2EDuration="22.748741428s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.777934769 +0000 UTC m=+2.737788221" lastFinishedPulling="2026-04-23 01:10:35.128473501 +0000 UTC m=+22.088326955" observedRunningTime="2026-04-23 01:10:35.747188608 +0000 UTC m=+22.707042067" watchObservedRunningTime="2026-04-23 01:10:35.748741428 +0000 UTC m=+22.708594887" Apr 23 01:10:37.556035 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:37.555808 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:37.556468 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:37.555808 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:37.556468 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:37.556084 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:37.556468 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:37.556172 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:38.741253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:38.741077 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerStarted","Data":"152ac4c67e8092d6e8dca1ed590d2e970995563f397504956694df9e6656fd9c"} Apr 23 01:10:38.744239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:38.744215 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" event={"ID":"6a5144da-faff-4d89-b9c9-baf899b2a716","Type":"ContainerStarted","Data":"63ecba232fbfe015d807b5c2f2f267d56fad24d31431ba983ccc275df005ad86"} Apr 23 01:10:38.744536 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:38.744522 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:38.744602 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:10:38.744544 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:38.758058 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:38.758037 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:38.758133 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:38.758090 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:38.784073 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:38.784036 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" podStartSLOduration=9.074246378 podStartE2EDuration="25.78402524s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.806965527 +0000 UTC m=+2.766818970" lastFinishedPulling="2026-04-23 01:10:32.516744381 +0000 UTC m=+19.476597832" observedRunningTime="2026-04-23 01:10:38.783877556 +0000 UTC m=+25.743731015" watchObservedRunningTime="2026-04-23 01:10:38.78402524 +0000 UTC m=+25.743878695" Apr 23 01:10:39.556202 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:39.556172 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:39.556455 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:39.556284 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:39.556455 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:39.556334 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:39.556455 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:39.556421 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:39.748108 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:39.748080 2565 generic.go:358] "Generic (PLEG): container finished" podID="54d9175d-7498-4b1f-8e42-2c7b5a37d2f4" containerID="152ac4c67e8092d6e8dca1ed590d2e970995563f397504956694df9e6656fd9c" exitCode=0 Apr 23 01:10:39.748465 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:39.748181 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerDied","Data":"152ac4c67e8092d6e8dca1ed590d2e970995563f397504956694df9e6656fd9c"} Apr 23 01:10:39.748465 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:39.748254 2565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 01:10:40.212559 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:40.212462 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vvtxz"] Apr 23 01:10:40.212736 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:40.212595 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:40.212736 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:40.212696 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:40.215400 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:40.215372 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ps42z"] Apr 23 01:10:40.215510 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:40.215478 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:40.215571 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:40.215552 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:40.751411 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:40.751347 2565 generic.go:358] "Generic (PLEG): container finished" podID="54d9175d-7498-4b1f-8e42-2c7b5a37d2f4" containerID="6868bbcef350b3ac3c1ed8dde6d0e16142d4206bf2e04db9b7e033db9f9a8539" exitCode=0 Apr 23 01:10:40.751411 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:40.751387 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerDied","Data":"6868bbcef350b3ac3c1ed8dde6d0e16142d4206bf2e04db9b7e033db9f9a8539"} Apr 23 01:10:40.751751 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:40.751588 2565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 01:10:41.556132 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:41.556061 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:41.556247 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:41.556194 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:41.755379 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:41.755352 2565 generic.go:358] "Generic (PLEG): container finished" podID="54d9175d-7498-4b1f-8e42-2c7b5a37d2f4" containerID="713dca8f7a6011bc7e1566c37275c622f38d2748662bfc65b0abdf54b10e72f6" exitCode=0 Apr 23 01:10:41.755760 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:41.755403 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerDied","Data":"713dca8f7a6011bc7e1566c37275c622f38d2748662bfc65b0abdf54b10e72f6"} Apr 23 01:10:42.555719 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:42.555544 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:42.555881 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:42.555818 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:42.678073 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:42.678035 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx" Apr 23 01:10:43.556566 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:43.556540 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:43.556950 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:43.556631 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvtxz" podUID="2b220af6-5884-49f2-943b-c16a83a47800" Apr 23 01:10:44.555971 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:44.555941 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:44.556159 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:44.556080 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ps42z" podUID="14afdf01-fa2e-4563-8fbf-0cc2613b39ba" Apr 23 01:10:45.343780 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.343717 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-74.ec2.internal" event="NodeReady" Apr 23 01:10:45.344174 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.343827 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 01:10:45.388297 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.388268 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-czwb8"] Apr 23 01:10:45.418001 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.417962 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p8psz"] Apr 23 01:10:45.418150 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.418128 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.420935 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.420913 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 01:10:45.420935 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.420932 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 01:10:45.421139 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.420971 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4q4b\"" Apr 23 01:10:45.440990 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.440957 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-czwb8"] Apr 23 01:10:45.440990 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.440993 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p8psz"] Apr 23 
01:10:45.441128 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.441107 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:45.443732 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.443709 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 01:10:45.443829 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.443745 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 01:10:45.443829 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.443710 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jjv8h\"" Apr 23 01:10:45.444063 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.444048 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 01:10:45.555487 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.555456 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:45.558782 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.558761 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hjv94\"" Apr 23 01:10:45.558915 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.558799 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 01:10:45.558915 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.558805 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 01:10:45.594316 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.594268 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.594316 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.594297 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndxz\" (UniqueName: \"kubernetes.io/projected/2efc1102-e677-4cde-b6a8-5304536665ad-kube-api-access-pndxz\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.594456 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.594323 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2efc1102-e677-4cde-b6a8-5304536665ad-tmp-dir\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.594456 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.594408 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:45.594456 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.594442 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7sx6\" (UniqueName: \"kubernetes.io/projected/9342de38-1c25-496b-b531-420cff35d1e6-kube-api-access-c7sx6\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:45.594576 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.594469 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2efc1102-e677-4cde-b6a8-5304536665ad-config-volume\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.695668 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.695644 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2efc1102-e677-4cde-b6a8-5304536665ad-tmp-dir\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.695814 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.695703 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 
01:10:45.695814 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.695726 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7sx6\" (UniqueName: \"kubernetes.io/projected/9342de38-1c25-496b-b531-420cff35d1e6-kube-api-access-c7sx6\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:45.695814 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.695756 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2efc1102-e677-4cde-b6a8-5304536665ad-config-volume\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.695814 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.695801 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.696030 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:45.695816 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:45.696030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.695822 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pndxz\" (UniqueName: \"kubernetes.io/projected/2efc1102-e677-4cde-b6a8-5304536665ad-kube-api-access-pndxz\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.696030 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:45.695885 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert 
podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:46.195866337 +0000 UTC m=+33.155719787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found Apr 23 01:10:45.696188 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.696032 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2efc1102-e677-4cde-b6a8-5304536665ad-tmp-dir\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.696188 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:45.696053 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:45.696188 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:45.696094 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:10:46.19608127 +0000 UTC m=+33.155934714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found Apr 23 01:10:45.696429 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.696409 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2efc1102-e677-4cde-b6a8-5304536665ad-config-volume\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.706158 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.706134 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndxz\" (UniqueName: \"kubernetes.io/projected/2efc1102-e677-4cde-b6a8-5304536665ad-kube-api-access-pndxz\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:45.706277 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:45.706174 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7sx6\" (UniqueName: \"kubernetes.io/projected/9342de38-1c25-496b-b531-420cff35d1e6-kube-api-access-c7sx6\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:46.199065 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:46.199031 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:46.199334 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:46.199093 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:46.199334 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:46.199195 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:46.199334 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:46.199201 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:46.199334 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:46.199268 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:10:47.199252479 +0000 UTC m=+34.159105917 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found Apr 23 01:10:46.199334 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:46.199283 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:47.199277408 +0000 UTC m=+34.159130845 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found Apr 23 01:10:46.555734 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:46.555643 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:46.559244 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:46.559212 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj44d\"" Apr 23 01:10:46.559244 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:46.559241 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 01:10:47.205444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:47.205408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:10:47.205638 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:47.205460 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:47.205638 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:47.205594 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 01:10:47.205747 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:47.205679 2565 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:11:19.205659016 +0000 UTC m=+66.165512454 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : secret "metrics-daemon-secret" not found Apr 23 01:10:47.205747 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:47.205686 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:47.205747 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:47.205599 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:47.205747 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:47.205686 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:47.205747 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:47.205742 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:10:49.205724711 +0000 UTC m=+36.165578148 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found Apr 23 01:10:47.205962 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:47.205763 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:49.205750762 +0000 UTC m=+36.165604216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found Apr 23 01:10:47.406861 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:47.406831 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:47.409092 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:47.409074 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxpct\" (UniqueName: \"kubernetes.io/projected/2b220af6-5884-49f2-943b-c16a83a47800-kube-api-access-kxpct\") pod \"network-check-target-vvtxz\" (UID: \"2b220af6-5884-49f2-943b-c16a83a47800\") " pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:47.666036 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:47.666016 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:47.980648 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:47.980465 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vvtxz"] Apr 23 01:10:47.984672 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:10:47.984647 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b220af6_5884_49f2_943b_c16a83a47800.slice/crio-c34414dfd836e8fc2bfac1752162b7466d09ef32aa3ff4ec8b96a22d78327cdd WatchSource:0}: Error finding container c34414dfd836e8fc2bfac1752162b7466d09ef32aa3ff4ec8b96a22d78327cdd: Status 404 returned error can't find the container with id c34414dfd836e8fc2bfac1752162b7466d09ef32aa3ff4ec8b96a22d78327cdd Apr 23 01:10:48.771250 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:48.771171 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vvtxz" event={"ID":"2b220af6-5884-49f2-943b-c16a83a47800","Type":"ContainerStarted","Data":"c34414dfd836e8fc2bfac1752162b7466d09ef32aa3ff4ec8b96a22d78327cdd"} Apr 23 01:10:48.773945 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:48.773917 2565 generic.go:358] "Generic (PLEG): container finished" podID="54d9175d-7498-4b1f-8e42-2c7b5a37d2f4" containerID="8c28f5724fa7090b7fabfb71a7d0ad5fa222882afe71eb1dc16202a51a4276ae" exitCode=0 Apr 23 01:10:48.774083 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:48.773964 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerDied","Data":"8c28f5724fa7090b7fabfb71a7d0ad5fa222882afe71eb1dc16202a51a4276ae"} Apr 23 01:10:49.219347 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:49.219314 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:10:49.219347 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:49.219359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:49.219593 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:49.219471 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:49.219593 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:49.219474 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:49.219593 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:49.219537 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:10:53.219517596 +0000 UTC m=+40.179371037 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found Apr 23 01:10:49.219593 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:49.219554 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:53.219546081 +0000 UTC m=+40.179399520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found Apr 23 01:10:49.778091 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:49.778049 2565 generic.go:358] "Generic (PLEG): container finished" podID="54d9175d-7498-4b1f-8e42-2c7b5a37d2f4" containerID="1feddffd6c80e7b381524743e4f134e687711879cb9e36e078b4c4bade110ab1" exitCode=0 Apr 23 01:10:49.778091 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:49.778083 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerDied","Data":"1feddffd6c80e7b381524743e4f134e687711879cb9e36e078b4c4bade110ab1"} Apr 23 01:10:51.782283 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:51.782251 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vvtxz" event={"ID":"2b220af6-5884-49f2-943b-c16a83a47800","Type":"ContainerStarted","Data":"83cd447f6c1c4f30c7d635deee7b34b59c824b8bd2c0b5b0e96d82c330262f2f"} Apr 23 01:10:51.782660 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:51.782362 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vvtxz" Apr 23 01:10:51.784885 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:51.784862 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" event={"ID":"54d9175d-7498-4b1f-8e42-2c7b5a37d2f4","Type":"ContainerStarted","Data":"ad692d54a0075b63b0833d89d3a81c4d4dcec1199a019b960f36500e2bd259d4"} Apr 23 01:10:51.797764 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:51.797722 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vvtxz" podStartSLOduration=35.656279896 podStartE2EDuration="38.797707699s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:47.988730034 +0000 UTC m=+34.948583471" lastFinishedPulling="2026-04-23 01:10:51.130157834 +0000 UTC m=+38.090011274" observedRunningTime="2026-04-23 01:10:51.797147202 +0000 UTC m=+38.757000661" watchObservedRunningTime="2026-04-23 01:10:51.797707699 +0000 UTC m=+38.757561158" Apr 23 01:10:51.818039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:51.817999 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g4vqs" podStartSLOduration=6.760320816 podStartE2EDuration="38.817971963s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:10:15.776250912 +0000 UTC m=+2.736104350" lastFinishedPulling="2026-04-23 01:10:47.833902045 +0000 UTC m=+34.793755497" observedRunningTime="2026-04-23 01:10:51.817527749 +0000 UTC m=+38.777381208" watchObservedRunningTime="2026-04-23 01:10:51.817971963 +0000 UTC m=+38.777825421" Apr 23 01:10:53.245904 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:53.245875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:10:53.246248 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:10:53.245924 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 
01:10:53.246248 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:53.246032 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:53.246248 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:53.246080 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:01.246067119 +0000 UTC m=+48.205920556 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found Apr 23 01:10:53.246248 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:53.246032 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:53.246248 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:10:53.246148 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:11:01.246136602 +0000 UTC m=+48.205990053 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found Apr 23 01:11:01.297894 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:01.297861 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz" Apr 23 01:11:01.298331 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:01.297906 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8" Apr 23 01:11:01.298331 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:01.298014 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:11:01.298331 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:01.298025 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:11:01.298331 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:01.298068 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:11:17.298051771 +0000 UTC m=+64.257905210 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found Apr 23 01:11:01.298331 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:01.298082 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:17.298076769 +0000 UTC m=+64.257930207 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found Apr 23 01:11:04.301068 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.301038 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c"] Apr 23 01:11:04.324354 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.324329 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c"] Apr 23 01:11:04.324354 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.324359 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g"] Apr 23 01:11:04.324574 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.324463 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.328961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.328938 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-j2fw8\"" Apr 23 01:11:04.329116 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.328989 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 01:11:04.329116 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.329009 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 01:11:04.330626 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.330292 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 01:11:04.330626 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.330293 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 01:11:04.342846 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.342822 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g"] Apr 23 01:11:04.342944 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.342921 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.346264 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.346248 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 01:11:04.517215 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.517188 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-tmp\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.517352 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.517220 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntv6f\" (UniqueName: \"kubernetes.io/projected/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-kube-api-access-ntv6f\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.517352 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.517262 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-klusterlet-config\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.517352 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.517310 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/1b13cab5-08ee-4342-a9e2-2d2eb3021a4e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75c9c8975c-mph4c\" (UID: \"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.517352 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.517349 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjpgz\" (UniqueName: \"kubernetes.io/projected/1b13cab5-08ee-4342-a9e2-2d2eb3021a4e-kube-api-access-gjpgz\") pod \"managed-serviceaccount-addon-agent-75c9c8975c-mph4c\" (UID: \"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.618072 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.618001 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-klusterlet-config\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.618072 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.618041 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1b13cab5-08ee-4342-a9e2-2d2eb3021a4e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75c9c8975c-mph4c\" (UID: \"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.618252 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.618155 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjpgz\" (UniqueName: 
\"kubernetes.io/projected/1b13cab5-08ee-4342-a9e2-2d2eb3021a4e-kube-api-access-gjpgz\") pod \"managed-serviceaccount-addon-agent-75c9c8975c-mph4c\" (UID: \"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.618252 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.618190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-tmp\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.618252 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.618215 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntv6f\" (UniqueName: \"kubernetes.io/projected/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-kube-api-access-ntv6f\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.618678 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.618657 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-tmp\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.621832 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.621804 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-klusterlet-config\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.622100 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.622079 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1b13cab5-08ee-4342-a9e2-2d2eb3021a4e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75c9c8975c-mph4c\" (UID: \"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.625993 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.625959 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjpgz\" (UniqueName: \"kubernetes.io/projected/1b13cab5-08ee-4342-a9e2-2d2eb3021a4e-kube-api-access-gjpgz\") pod \"managed-serviceaccount-addon-agent-75c9c8975c-mph4c\" (UID: \"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.626283 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.626263 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntv6f\" (UniqueName: \"kubernetes.io/projected/bd4013b5-2b2e-4b56-b0fc-bd99c044a509-kube-api-access-ntv6f\") pod \"klusterlet-addon-workmgr-bc5d45476-v764g\" (UID: \"bd4013b5-2b2e-4b56-b0fc-bd99c044a509\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.651055 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.651036 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" Apr 23 01:11:04.653592 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.653571 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" Apr 23 01:11:04.772417 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.772390 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g"] Apr 23 01:11:04.775713 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:11:04.775677 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4013b5_2b2e_4b56_b0fc_bd99c044a509.slice/crio-b6669113885558bea71f861e9e9fc5144dee5da01db4f3cf25081c04ef7e710c WatchSource:0}: Error finding container b6669113885558bea71f861e9e9fc5144dee5da01db4f3cf25081c04ef7e710c: Status 404 returned error can't find the container with id b6669113885558bea71f861e9e9fc5144dee5da01db4f3cf25081c04ef7e710c Apr 23 01:11:04.787154 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.787132 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c"] Apr 23 01:11:04.790462 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:11:04.790441 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b13cab5_08ee_4342_a9e2_2d2eb3021a4e.slice/crio-c4973048a5bfc21e9a8a3d79f2f53c610aada320b9edceed0066506f255dec5a WatchSource:0}: Error finding container c4973048a5bfc21e9a8a3d79f2f53c610aada320b9edceed0066506f255dec5a: Status 404 returned error can't find the container with id c4973048a5bfc21e9a8a3d79f2f53c610aada320b9edceed0066506f255dec5a Apr 23 01:11:04.807030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.806999 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" 
event={"ID":"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e","Type":"ContainerStarted","Data":"c4973048a5bfc21e9a8a3d79f2f53c610aada320b9edceed0066506f255dec5a"}
Apr 23 01:11:04.807814 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:04.807793 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" event={"ID":"bd4013b5-2b2e-4b56-b0fc-bd99c044a509","Type":"ContainerStarted","Data":"b6669113885558bea71f861e9e9fc5144dee5da01db4f3cf25081c04ef7e710c"}
Apr 23 01:11:08.817261 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:08.817224 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" event={"ID":"1b13cab5-08ee-4342-a9e2-2d2eb3021a4e","Type":"ContainerStarted","Data":"b6fdbd7805750c9ef208f47c96cd927b7292ea7063ce92dfdda8be1a2ae544d7"}
Apr 23 01:11:09.820200 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:09.820164 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" event={"ID":"bd4013b5-2b2e-4b56-b0fc-bd99c044a509","Type":"ContainerStarted","Data":"9fd98ffe95d6fa55cdc4d8205b117d74529a85f93a957c3afe4b2cb5acb3f334"}
Apr 23 01:11:09.835461 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:09.835418 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c9c8975c-mph4c" podStartSLOduration=2.65550677 podStartE2EDuration="5.835405222s" podCreationTimestamp="2026-04-23 01:11:04 +0000 UTC" firstStartedPulling="2026-04-23 01:11:04.792235753 +0000 UTC m=+51.752089191" lastFinishedPulling="2026-04-23 01:11:07.9721342 +0000 UTC m=+54.931987643" observedRunningTime="2026-04-23 01:11:08.831961007 +0000 UTC m=+55.791814467" watchObservedRunningTime="2026-04-23 01:11:09.835405222 +0000 UTC m=+56.795258683"
Apr 23 01:11:09.835576 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:09.835485 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g" podStartSLOduration=0.986475729 podStartE2EDuration="5.835481826s" podCreationTimestamp="2026-04-23 01:11:04 +0000 UTC" firstStartedPulling="2026-04-23 01:11:04.777604957 +0000 UTC m=+51.737458394" lastFinishedPulling="2026-04-23 01:11:09.626611054 +0000 UTC m=+56.586464491" observedRunningTime="2026-04-23 01:11:09.835026094 +0000 UTC m=+56.794879554" watchObservedRunningTime="2026-04-23 01:11:09.835481826 +0000 UTC m=+56.795335285"
Apr 23 01:11:10.822464 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:10.822427 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g"
Apr 23 01:11:10.823952 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:10.823930 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc5d45476-v764g"
Apr 23 01:11:12.688624 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:12.688599 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xt5wx"
Apr 23 01:11:17.300823 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:17.300787 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8"
Apr 23 01:11:17.301230 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:17.300842 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz"
Apr 23 01:11:17.301230 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:17.300923 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:11:17.301230 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:17.300932 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:11:17.301230 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:17.301013 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:49.300998265 +0000 UTC m=+96.260851703 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found
Apr 23 01:11:17.301230 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:17.301069 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:11:49.301048419 +0000 UTC m=+96.260901857 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found
Apr 23 01:11:19.213085 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:19.213045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z"
Apr 23 01:11:19.213462 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:19.213186 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 01:11:19.213462 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:19.213250 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs podName:14afdf01-fa2e-4563-8fbf-0cc2613b39ba nodeName:}" failed. No retries permitted until 2026-04-23 01:12:23.213234549 +0000 UTC m=+130.173087988 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs") pod "network-metrics-daemon-ps42z" (UID: "14afdf01-fa2e-4563-8fbf-0cc2613b39ba") : secret "metrics-daemon-secret" not found
Apr 23 01:11:22.788829 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:22.788798 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vvtxz"
Apr 23 01:11:41.487684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.487550 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-78df469446-xzwmg"]
Apr 23 01:11:41.494482 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.494463 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.496994 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.496959 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-7tmdl\""
Apr 23 01:11:41.497130 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.497049 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 01:11:41.497276 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.497262 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 01:11:41.497317 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.497289 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 01:11:41.497372 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.497267 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 01:11:41.498258 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.498232 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 01:11:41.498366 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.498289 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 01:11:41.501419 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.501395 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78df469446-xzwmg"]
Apr 23 01:11:41.567267 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.567244 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.567370 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.567273 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-default-certificate\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.567370 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.567341 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-stats-auth\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.567439 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.567368 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vshqw\" (UniqueName: \"kubernetes.io/projected/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-kube-api-access-vshqw\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.567439 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.567397 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.668105 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.668075 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.668186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.668113 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-default-certificate\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.668186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.668163 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-stats-auth\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.668186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.668181 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vshqw\" (UniqueName: \"kubernetes.io/projected/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-kube-api-access-vshqw\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.668298 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.668201 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.668298 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:41.668257 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:42.168236159 +0000 UTC m=+89.128089602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : configmap references non-existent config key: service-ca.crt
Apr 23 01:11:41.668298 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:41.668267 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 01:11:41.668484 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:41.668318 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:42.168306141 +0000 UTC m=+89.128159579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : secret "router-metrics-certs-default" not found
Apr 23 01:11:41.670534 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.670512 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-stats-auth\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.670689 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.670573 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-default-certificate\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:41.676782 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:41.676759 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vshqw\" (UniqueName: \"kubernetes.io/projected/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-kube-api-access-vshqw\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:42.171280 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:42.171236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:42.171455 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:42.171300 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:42.171455 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:42.171392 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 01:11:42.171455 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:42.171412 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:43.171396132 +0000 UTC m=+90.131249574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : configmap references non-existent config key: service-ca.crt
Apr 23 01:11:42.171455 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:42.171456 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:43.171441943 +0000 UTC m=+90.131295381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : secret "router-metrics-certs-default" not found
Apr 23 01:11:43.178400 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:43.178364 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:43.178830 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:43.178410 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:43.178830 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:43.178545 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:45.178530378 +0000 UTC m=+92.138383816 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : configmap references non-existent config key: service-ca.crt
Apr 23 01:11:43.178830 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:43.178561 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 01:11:43.178830 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:43.178628 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:45.178609654 +0000 UTC m=+92.138463106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : secret "router-metrics-certs-default" not found
Apr 23 01:11:45.194055 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:45.194018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:45.194526 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:45.194067 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:45.194526 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:45.194156 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 01:11:45.194526 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:45.194169 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:49.194156026 +0000 UTC m=+96.154009465 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : configmap references non-existent config key: service-ca.crt
Apr 23 01:11:45.194526 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:45.194206 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:49.194193457 +0000 UTC m=+96.154046899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : secret "router-metrics-certs-default" not found
Apr 23 01:11:48.181569 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:48.181542 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-twhzp_100ffad2-0adc-4293-8bc1-c64fdc753f08/dns-node-resolver/0.log"
Apr 23 01:11:48.981546 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:48.981513 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rbjs6_318c5767-f4ad-4937-bccb-ef0c86ed7ff7/node-ca/0.log"
Apr 23 01:11:49.222305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:49.222271 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:49.222702 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:49.222317 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg"
Apr 23 01:11:49.222702 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:49.222433 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:57.222420862 +0000 UTC m=+104.182274300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : configmap references non-existent config key: service-ca.crt
Apr 23 01:11:49.222702 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:49.222434 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 01:11:49.222702 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:49.222494 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:57.222476232 +0000 UTC m=+104.182329684 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : secret "router-metrics-certs-default" not found
Apr 23 01:11:49.323320 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:49.323251 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz"
Apr 23 01:11:49.323320 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:49.323308 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8"
Apr 23 01:11:49.323448 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:49.323380 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:11:49.323448 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:49.323389 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:11:49.323448 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:49.323439 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert podName:9342de38-1c25-496b-b531-420cff35d1e6 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:53.323422922 +0000 UTC m=+160.283276360 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert") pod "ingress-canary-p8psz" (UID: "9342de38-1c25-496b-b531-420cff35d1e6") : secret "canary-serving-cert" not found
Apr 23 01:11:49.323541 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:49.323453 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls podName:2efc1102-e677-4cde-b6a8-5304536665ad nodeName:}" failed. No retries permitted until 2026-04-23 01:12:53.323447551 +0000 UTC m=+160.283300989 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls") pod "dns-default-czwb8" (UID: "2efc1102-e677-4cde-b6a8-5304536665ad") : secret "dns-default-metrics-tls" not found
Apr 23 01:11:51.626777 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.626741 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"]
Apr 23 01:11:51.629565 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.629550 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.631944 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.631923 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 01:11:51.632129 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.632095 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jjs4f\""
Apr 23 01:11:51.633268 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.633252 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 01:11:51.633268 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.633266 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:11:51.633393 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.633284 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 01:11:51.638119 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.638094 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"]
Apr 23 01:11:51.738862 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.738826 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-serving-cert\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.739089 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.738912 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn5gv\" (UniqueName: \"kubernetes.io/projected/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-kube-api-access-xn5gv\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.739089 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.738952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-config\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.839351 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.839306 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn5gv\" (UniqueName: \"kubernetes.io/projected/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-kube-api-access-xn5gv\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.839460 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.839374 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-config\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.839460 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.839394 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-serving-cert\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.839929 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.839905 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-config\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.841533 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.841517 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-serving-cert\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.847022 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.847004 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn5gv\" (UniqueName: \"kubernetes.io/projected/8cfda3ef-b3c6-4cb0-8e04-57fefbef4404-kube-api-access-xn5gv\") pod \"service-ca-operator-d6fc45fc5-b9dl5\" (UID: \"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:51.939468 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:51.939444 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"
Apr 23 01:11:52.047899 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:52.047869 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5"]
Apr 23 01:11:52.051069 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:11:52.051038 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cfda3ef_b3c6_4cb0_8e04_57fefbef4404.slice/crio-7bdda80da1da3a696dc9c0e8dc53cce1505b2c8263e21405fbb55693472947a5 WatchSource:0}: Error finding container 7bdda80da1da3a696dc9c0e8dc53cce1505b2c8263e21405fbb55693472947a5: Status 404 returned error can't find the container with id 7bdda80da1da3a696dc9c0e8dc53cce1505b2c8263e21405fbb55693472947a5
Apr 23 01:11:52.899871 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:52.899828 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5" event={"ID":"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404","Type":"ContainerStarted","Data":"7bdda80da1da3a696dc9c0e8dc53cce1505b2c8263e21405fbb55693472947a5"}
Apr 23 01:11:54.905088 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:54.905052 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5" event={"ID":"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404","Type":"ContainerStarted","Data":"e25ff28339a9f727224c90b70b40dae141fddef7afb073f32b0ab272806b2f98"}
Apr 23 01:11:54.920229 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:54.920188 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5" podStartSLOduration=1.234001108 podStartE2EDuration="3.92017438s" podCreationTimestamp="2026-04-23 01:11:51 +0000 UTC" firstStartedPulling="2026-04-23 01:11:52.053172608 +0000 UTC m=+99.013026047" lastFinishedPulling="2026-04-23 01:11:54.739345877 +0000 UTC m=+101.699199319" observedRunningTime="2026-04-23 01:11:54.918513088 +0000 UTC m=+101.878366550" watchObservedRunningTime="2026-04-23 01:11:54.92017438 +0000 UTC m=+101.880027831"
Apr 23 01:11:56.870443 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:56.870406 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz"]
Apr 23 01:11:56.873511 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:56.873496 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz"
Apr 23 01:11:56.875824 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:56.875806 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-gw9m7\""
Apr 23 01:11:56.878967 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:56.878945 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz"]
Apr 23 01:11:56.981250 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:56.981204 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vghtf\" (UniqueName: \"kubernetes.io/projected/69187cbc-d5af-458a-8685-2d0d6a58e4e9-kube-api-access-vghtf\") pod \"network-check-source-8894fc9bd-d2wkz\" (UID: \"69187cbc-d5af-458a-8685-2d0d6a58e4e9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz"
Apr 23 01:11:57.082619 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.082586 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vghtf\" (UniqueName: \"kubernetes.io/projected/69187cbc-d5af-458a-8685-2d0d6a58e4e9-kube-api-access-vghtf\") pod \"network-check-source-8894fc9bd-d2wkz\" (UID: \"69187cbc-d5af-458a-8685-2d0d6a58e4e9\") "
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz" Apr 23 01:11:57.096255 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.096224 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vghtf\" (UniqueName: \"kubernetes.io/projected/69187cbc-d5af-458a-8685-2d0d6a58e4e9-kube-api-access-vghtf\") pod \"network-check-source-8894fc9bd-d2wkz\" (UID: \"69187cbc-d5af-458a-8685-2d0d6a58e4e9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz" Apr 23 01:11:57.182537 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.182503 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz" Apr 23 01:11:57.283402 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.283370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:11:57.283581 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.283418 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:11:57.283581 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:57.283481 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 01:11:57.283581 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:57.283528 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:13.283514904 +0000 UTC m=+120.243368341 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : configmap references non-existent config key: service-ca.crt Apr 23 01:11:57.283581 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:11:57.283543 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs podName:cf49adfe-57aa-40fc-9a7b-4156e87b2ad7 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:13.283536924 +0000 UTC m=+120.243390361 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs") pod "router-default-78df469446-xzwmg" (UID: "cf49adfe-57aa-40fc-9a7b-4156e87b2ad7") : secret "router-metrics-certs-default" not found Apr 23 01:11:57.294463 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.294433 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz"] Apr 23 01:11:57.297543 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:11:57.297517 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69187cbc_d5af_458a_8685_2d0d6a58e4e9.slice/crio-9f17130c907e51d057d0668073d157da74ce6bd7cf453ae6b86a73c199f0fe94 WatchSource:0}: Error finding container 9f17130c907e51d057d0668073d157da74ce6bd7cf453ae6b86a73c199f0fe94: Status 404 returned error can't find the container with id 
9f17130c907e51d057d0668073d157da74ce6bd7cf453ae6b86a73c199f0fe94 Apr 23 01:11:57.916005 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.915955 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz" event={"ID":"69187cbc-d5af-458a-8685-2d0d6a58e4e9","Type":"ContainerStarted","Data":"23d04e38b32d895c1193c6e89196eba514945b4b3486b04c89abf7a93a65ec89"} Apr 23 01:11:57.916441 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.916012 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz" event={"ID":"69187cbc-d5af-458a-8685-2d0d6a58e4e9","Type":"ContainerStarted","Data":"9f17130c907e51d057d0668073d157da74ce6bd7cf453ae6b86a73c199f0fe94"} Apr 23 01:11:57.930742 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:11:57.930701 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-d2wkz" podStartSLOduration=1.930690329 podStartE2EDuration="1.930690329s" podCreationTimestamp="2026-04-23 01:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:11:57.92975992 +0000 UTC m=+104.889613379" watchObservedRunningTime="2026-04-23 01:11:57.930690329 +0000 UTC m=+104.890543789" Apr 23 01:12:13.299796 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.299760 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:13.300184 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.299808 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:13.300411 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.300392 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-service-ca-bundle\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:13.302017 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.301997 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf49adfe-57aa-40fc-9a7b-4156e87b2ad7-metrics-certs\") pod \"router-default-78df469446-xzwmg\" (UID: \"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7\") " pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:13.304903 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.304882 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:13.416676 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.416604 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78df469446-xzwmg"] Apr 23 01:12:13.419106 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:13.419079 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf49adfe_57aa_40fc_9a7b_4156e87b2ad7.slice/crio-7aace18a409bb13ec90eb60c8bf81bfa6a3d18636d93fbeb73d1e8acc89621a3 WatchSource:0}: Error finding container 7aace18a409bb13ec90eb60c8bf81bfa6a3d18636d93fbeb73d1e8acc89621a3: Status 404 returned error can't find the container with id 7aace18a409bb13ec90eb60c8bf81bfa6a3d18636d93fbeb73d1e8acc89621a3 Apr 23 01:12:13.948253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.948219 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78df469446-xzwmg" event={"ID":"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7","Type":"ContainerStarted","Data":"55f776fb129c6d775f9dcfc6da5321af825b0564cdfd2998dc919c6d64e5a5e5"} Apr 23 01:12:13.948253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.948253 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78df469446-xzwmg" event={"ID":"cf49adfe-57aa-40fc-9a7b-4156e87b2ad7","Type":"ContainerStarted","Data":"7aace18a409bb13ec90eb60c8bf81bfa6a3d18636d93fbeb73d1e8acc89621a3"} Apr 23 01:12:13.965342 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:13.965302 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-78df469446-xzwmg" podStartSLOduration=32.965288545 podStartE2EDuration="32.965288545s" podCreationTimestamp="2026-04-23 01:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
01:12:13.964596152 +0000 UTC m=+120.924449612" watchObservedRunningTime="2026-04-23 01:12:13.965288545 +0000 UTC m=+120.925142005" Apr 23 01:12:14.305913 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:14.305844 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:14.308536 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:14.308511 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:14.951321 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:14.951289 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:14.952455 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:14.952436 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-78df469446-xzwmg" Apr 23 01:12:20.380520 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.380481 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6q2sj"] Apr 23 01:12:20.384783 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.384762 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.388872 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.388842 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 01:12:20.389052 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.388842 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8pwvt\"" Apr 23 01:12:20.389052 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.388925 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 01:12:20.389052 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.389004 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 01:12:20.389214 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.389198 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 01:12:20.397401 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.397376 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6q2sj"] Apr 23 01:12:20.488351 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.488316 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86f775b4f-hdvw2"] Apr 23 01:12:20.506849 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.506829 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.510756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.510714 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86f775b4f-hdvw2"] Apr 23 01:12:20.514094 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.514066 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 01:12:20.514270 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.514250 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ndfmc\"" Apr 23 01:12:20.514362 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.514301 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 01:12:20.514362 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.514082 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 01:12:20.520015 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.519996 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 01:12:20.548491 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.548469 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljvmv\" (UniqueName: \"kubernetes.io/projected/dc212b92-93e6-442a-ba12-b470f57b4965-kube-api-access-ljvmv\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.548575 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.548499 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc212b92-93e6-442a-ba12-b470f57b4965-crio-socket\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.548575 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.548522 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc212b92-93e6-442a-ba12-b470f57b4965-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.548676 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.548633 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc212b92-93e6-442a-ba12-b470f57b4965-data-volume\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.548676 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.548666 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc212b92-93e6-442a-ba12-b470f57b4965-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.649413 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649392 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljvmv\" (UniqueName: \"kubernetes.io/projected/dc212b92-93e6-442a-ba12-b470f57b4965-kube-api-access-ljvmv\") pod \"insights-runtime-extractor-6q2sj\" (UID: 
\"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.649509 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649418 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-bound-sa-token\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.649509 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649436 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc212b92-93e6-442a-ba12-b470f57b4965-crio-socket\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.649509 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649458 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc212b92-93e6-442a-ba12-b470f57b4965-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.649655 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649513 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73635124-a810-416a-8482-00ba10f2ad6e-image-registry-private-configuration\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.649655 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649545 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7dg\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-kube-api-access-4b7dg\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.649655 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73635124-a810-416a-8482-00ba10f2ad6e-ca-trust-extracted\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.649655 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649593 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dc212b92-93e6-442a-ba12-b470f57b4965-crio-socket\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.649805 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649684 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73635124-a810-416a-8482-00ba10f2ad6e-registry-certificates\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.649805 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649715 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73635124-a810-416a-8482-00ba10f2ad6e-trusted-ca\") pod 
\"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.649805 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649737 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc212b92-93e6-442a-ba12-b470f57b4965-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.649904 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc212b92-93e6-442a-ba12-b470f57b4965-data-volume\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.649904 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649857 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73635124-a810-416a-8482-00ba10f2ad6e-installation-pull-secrets\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.649904 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.649893 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-registry-tls\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.650206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.650186 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dc212b92-93e6-442a-ba12-b470f57b4965-data-volume\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.650272 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.650254 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dc212b92-93e6-442a-ba12-b470f57b4965-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.651840 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.651820 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dc212b92-93e6-442a-ba12-b470f57b4965-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.657540 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.657523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljvmv\" (UniqueName: \"kubernetes.io/projected/dc212b92-93e6-442a-ba12-b470f57b4965-kube-api-access-ljvmv\") pod \"insights-runtime-extractor-6q2sj\" (UID: \"dc212b92-93e6-442a-ba12-b470f57b4965\") " pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.696538 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.696517 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6q2sj" Apr 23 01:12:20.750253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750225 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73635124-a810-416a-8482-00ba10f2ad6e-installation-pull-secrets\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.750395 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750265 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-registry-tls\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.750395 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750289 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-bound-sa-token\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.750395 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750337 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73635124-a810-416a-8482-00ba10f2ad6e-image-registry-private-configuration\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.750395 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750363 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-4b7dg\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-kube-api-access-4b7dg\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.750395 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73635124-a810-416a-8482-00ba10f2ad6e-ca-trust-extracted\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.750657 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750412 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73635124-a810-416a-8482-00ba10f2ad6e-registry-certificates\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.750657 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.750443 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73635124-a810-416a-8482-00ba10f2ad6e-trusted-ca\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.751676 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.751490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73635124-a810-416a-8482-00ba10f2ad6e-ca-trust-extracted\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " 
pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.751963 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.751938 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73635124-a810-416a-8482-00ba10f2ad6e-trusted-ca\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.752248 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.752190 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73635124-a810-416a-8482-00ba10f2ad6e-registry-certificates\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.755885 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.755841 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73635124-a810-416a-8482-00ba10f2ad6e-installation-pull-secrets\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.756850 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.756805 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73635124-a810-416a-8482-00ba10f2ad6e-image-registry-private-configuration\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.757487 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.757450 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-registry-tls\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.761020 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.760959 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-bound-sa-token\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.761132 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.761079 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7dg\" (UniqueName: \"kubernetes.io/projected/73635124-a810-416a-8482-00ba10f2ad6e-kube-api-access-4b7dg\") pod \"image-registry-86f775b4f-hdvw2\" (UID: \"73635124-a810-416a-8482-00ba10f2ad6e\") " pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.809722 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.809698 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6q2sj"] Apr 23 01:12:20.812463 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:20.812440 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc212b92_93e6_442a_ba12_b470f57b4965.slice/crio-d906b74deb35af345634d12af185e431f34b7760e92cd129aa5b563bd8971bfb WatchSource:0}: Error finding container d906b74deb35af345634d12af185e431f34b7760e92cd129aa5b563bd8971bfb: Status 404 returned error can't find the container with id d906b74deb35af345634d12af185e431f34b7760e92cd129aa5b563bd8971bfb Apr 23 01:12:20.822965 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.822943 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:20.939769 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.939699 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86f775b4f-hdvw2"] Apr 23 01:12:20.942409 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:20.942382 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73635124_a810_416a_8482_00ba10f2ad6e.slice/crio-353ca782de05ec4bdfcb6035860b4040e4a6db2f81dd9b306369eaf67c4aa1ca WatchSource:0}: Error finding container 353ca782de05ec4bdfcb6035860b4040e4a6db2f81dd9b306369eaf67c4aa1ca: Status 404 returned error can't find the container with id 353ca782de05ec4bdfcb6035860b4040e4a6db2f81dd9b306369eaf67c4aa1ca Apr 23 01:12:20.969503 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.969478 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" event={"ID":"73635124-a810-416a-8482-00ba10f2ad6e","Type":"ContainerStarted","Data":"353ca782de05ec4bdfcb6035860b4040e4a6db2f81dd9b306369eaf67c4aa1ca"} Apr 23 01:12:20.970580 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.970554 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6q2sj" event={"ID":"dc212b92-93e6-442a-ba12-b470f57b4965","Type":"ContainerStarted","Data":"6db3888690a4ea731f9a11f025d3b6a7ea77542b3b4658fe178308599544c8f4"} Apr 23 01:12:20.970580 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:20.970577 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6q2sj" event={"ID":"dc212b92-93e6-442a-ba12-b470f57b4965","Type":"ContainerStarted","Data":"d906b74deb35af345634d12af185e431f34b7760e92cd129aa5b563bd8971bfb"} Apr 23 01:12:21.978049 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:21.977945 2565 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6q2sj" event={"ID":"dc212b92-93e6-442a-ba12-b470f57b4965","Type":"ContainerStarted","Data":"7ce0fc79a17194f03c476460c23a8ced7162be2824bd42c5272b4c0a7d59dcd4"} Apr 23 01:12:21.979060 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:21.979038 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" event={"ID":"73635124-a810-416a-8482-00ba10f2ad6e","Type":"ContainerStarted","Data":"2449844cfba174ef426744a65e6df35e4eb86010ec97ca63d50daf2ca9448ec5"} Apr 23 01:12:21.979229 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:21.979205 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:12:21.998605 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:21.998566 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podStartSLOduration=1.998555184 podStartE2EDuration="1.998555184s" podCreationTimestamp="2026-04-23 01:12:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:12:21.997392176 +0000 UTC m=+128.957245633" watchObservedRunningTime="2026-04-23 01:12:21.998555184 +0000 UTC m=+128.958408644" Apr 23 01:12:23.267637 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:23.267585 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:12:23.270770 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:23.270745 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/14afdf01-fa2e-4563-8fbf-0cc2613b39ba-metrics-certs\") pod \"network-metrics-daemon-ps42z\" (UID: \"14afdf01-fa2e-4563-8fbf-0cc2613b39ba\") " pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:12:23.467799 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:23.467767 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj44d\"" Apr 23 01:12:23.475833 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:23.475812 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ps42z" Apr 23 01:12:23.592559 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:23.592533 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ps42z"] Apr 23 01:12:23.595292 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:23.595261 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14afdf01_fa2e_4563_8fbf_0cc2613b39ba.slice/crio-adec4fe57e6a1a23c4e60e77acc183aeb8ba9b98815981b1816c9b3121b49f62 WatchSource:0}: Error finding container adec4fe57e6a1a23c4e60e77acc183aeb8ba9b98815981b1816c9b3121b49f62: Status 404 returned error can't find the container with id adec4fe57e6a1a23c4e60e77acc183aeb8ba9b98815981b1816c9b3121b49f62 Apr 23 01:12:23.985236 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:23.985205 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6q2sj" event={"ID":"dc212b92-93e6-442a-ba12-b470f57b4965","Type":"ContainerStarted","Data":"87aebba70da5795c833e68550e4d517565984482142ade565c38f1af7dda1002"} Apr 23 01:12:23.986136 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:23.986116 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ps42z" 
event={"ID":"14afdf01-fa2e-4563-8fbf-0cc2613b39ba","Type":"ContainerStarted","Data":"adec4fe57e6a1a23c4e60e77acc183aeb8ba9b98815981b1816c9b3121b49f62"} Apr 23 01:12:24.004959 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:24.004922 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6q2sj" podStartSLOduration=1.3947002560000001 podStartE2EDuration="4.00491033s" podCreationTimestamp="2026-04-23 01:12:20 +0000 UTC" firstStartedPulling="2026-04-23 01:12:20.878661206 +0000 UTC m=+127.838514644" lastFinishedPulling="2026-04-23 01:12:23.488871279 +0000 UTC m=+130.448724718" observedRunningTime="2026-04-23 01:12:24.004407027 +0000 UTC m=+130.964260486" watchObservedRunningTime="2026-04-23 01:12:24.00491033 +0000 UTC m=+130.964763789" Apr 23 01:12:24.989757 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:24.989720 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ps42z" event={"ID":"14afdf01-fa2e-4563-8fbf-0cc2613b39ba","Type":"ContainerStarted","Data":"afc972224db1c8c073f01551b51542fdc526e08c13881f5d9291b5ea69f2c02c"} Apr 23 01:12:24.990105 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:24.989763 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ps42z" event={"ID":"14afdf01-fa2e-4563-8fbf-0cc2613b39ba","Type":"ContainerStarted","Data":"725db6b01d15297649f989a3dd72f589dc2ab76806e635453ff4b4be65ec7c2d"} Apr 23 01:12:25.005730 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:25.005688 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ps42z" podStartSLOduration=130.97996156 podStartE2EDuration="2m12.005674794s" podCreationTimestamp="2026-04-23 01:10:13 +0000 UTC" firstStartedPulling="2026-04-23 01:12:23.597059225 +0000 UTC m=+130.556912663" lastFinishedPulling="2026-04-23 01:12:24.622772444 +0000 UTC m=+131.582625897" 
observedRunningTime="2026-04-23 01:12:25.005234786 +0000 UTC m=+131.965088249" watchObservedRunningTime="2026-04-23 01:12:25.005674794 +0000 UTC m=+131.965528254" Apr 23 01:12:28.376818 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.376789 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-svn7q"] Apr 23 01:12:28.382556 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.382539 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.385145 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.385123 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 01:12:28.385270 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.385251 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 01:12:28.385354 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.385274 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 01:12:28.385354 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.385287 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 01:12:28.385354 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.385331 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 01:12:28.386217 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.386202 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-85prc\"" Apr 23 01:12:28.386291 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.386208 2565 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 01:12:28.506902 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.506879 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-textfile\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.506908 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66e0905b-e676-4e53-a97a-bfbf17b1c22d-metrics-client-ca\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.506928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-accelerators-collector-config\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.506944 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtzpm\" (UniqueName: \"kubernetes.io/projected/66e0905b-e676-4e53-a97a-bfbf17b1c22d-kube-api-access-jtzpm\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.506960 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-tls\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.507035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-wtmp\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.507062 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-sys\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.507079 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.507186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.507125 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-root\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 
01:12:28.607526 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607503 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-root\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607602 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607546 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-textfile\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607602 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607568 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66e0905b-e676-4e53-a97a-bfbf17b1c22d-metrics-client-ca\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607602 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607593 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-accelerators-collector-config\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607725 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607596 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-root\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " 
pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607725 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607616 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtzpm\" (UniqueName: \"kubernetes.io/projected/66e0905b-e676-4e53-a97a-bfbf17b1c22d-kube-api-access-jtzpm\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607725 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607641 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-tls\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607725 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607697 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-wtmp\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607899 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:28.607740 2565 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 01:12:28.607899 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607777 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-sys\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607899 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:28.607798 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-tls podName:66e0905b-e676-4e53-a97a-bfbf17b1c22d nodeName:}" failed. No retries permitted until 2026-04-23 01:12:29.107779069 +0000 UTC m=+136.067632507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-tls") pod "node-exporter-svn7q" (UID: "66e0905b-e676-4e53-a97a-bfbf17b1c22d") : secret "node-exporter-tls" not found Apr 23 01:12:28.607899 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607738 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-sys\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607899 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607829 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607899 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-wtmp\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.607899 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.607895 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-textfile\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.608241 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.608222 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66e0905b-e676-4e53-a97a-bfbf17b1c22d-metrics-client-ca\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.608276 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.608255 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-accelerators-collector-config\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.610032 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.610014 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:28.615551 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:28.615532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtzpm\" (UniqueName: \"kubernetes.io/projected/66e0905b-e676-4e53-a97a-bfbf17b1c22d-kube-api-access-jtzpm\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:29.112482 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.112448 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-tls\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:29.114550 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.114531 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/66e0905b-e676-4e53-a97a-bfbf17b1c22d-node-exporter-tls\") pod \"node-exporter-svn7q\" (UID: \"66e0905b-e676-4e53-a97a-bfbf17b1c22d\") " pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:29.291573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.291547 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-svn7q" Apr 23 01:12:29.298904 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:29.298881 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66e0905b_e676_4e53_a97a_bfbf17b1c22d.slice/crio-c41791f419f842e5ba18d7794676dd6b84ec77178a21a05f1926134d09108b82 WatchSource:0}: Error finding container c41791f419f842e5ba18d7794676dd6b84ec77178a21a05f1926134d09108b82: Status 404 returned error can't find the container with id c41791f419f842e5ba18d7794676dd6b84ec77178a21a05f1926134d09108b82 Apr 23 01:12:29.439067 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.439040 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 01:12:29.444702 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.444687 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.447150 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447125 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 01:12:29.447266 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447127 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 01:12:29.447337 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447297 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 01:12:29.447337 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447322 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 01:12:29.447449 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447366 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 01:12:29.447449 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447378 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 01:12:29.447610 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447588 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 01:12:29.447610 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447603 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 01:12:29.447714 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447621 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 01:12:29.447714 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.447623 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hrrqc\"" Apr 23 01:12:29.455859 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.455841 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 01:12:29.514653 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514634 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.514746 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514663 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.514746 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514691 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.514746 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514709 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.514933 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514758 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.514933 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514812 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-out\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.514933 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514839 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.514933 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514905 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t5wk\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-kube-api-access-8t5wk\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.515105 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:12:29.514935 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-web-config\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.515105 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.515105 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.514972 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-volume\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.515105 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.515033 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.515105 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.515052 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615662 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615771 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615771 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615718 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615771 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615734 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615771 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615766 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615923 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615790 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-out\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615923 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615815 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.615923 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.615856 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t5wk\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-kube-api-access-8t5wk\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.616337 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.616315 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-web-config\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.616520 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:12:29.616495 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.616644 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.616629 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-volume\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.617098 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.617030 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.617098 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:29.616331 2565 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 01:12:29.619191 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.618945 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-out\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.619191 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.619148 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.620635 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.619570 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.620635 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.619655 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-volume\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.620635 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:29.619928 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls podName:16b34105-0f0d-47b8-9a52-6f45a6e31657 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:30.119884994 +0000 UTC m=+137.079738446 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657") : secret "alertmanager-main-tls" not found Apr 23 01:12:29.620635 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.620042 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-web-config\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.620635 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.620069 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.620635 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.620113 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.620635 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.620426 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.620635 ip-10-0-135-74 
kubenswrapper[2565]: E0423 01:12:29.620529 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle podName:16b34105-0f0d-47b8-9a52-6f45a6e31657 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:30.120509114 +0000 UTC m=+137.080362562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657") : configmap references non-existent config key: ca-bundle.crt Apr 23 01:12:29.621099 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.620692 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.622773 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.621635 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.623626 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.623602 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:29.629895 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:29.629876 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t5wk\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-kube-api-access-8t5wk\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:30.004738 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:30.004701 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-svn7q" event={"ID":"66e0905b-e676-4e53-a97a-bfbf17b1c22d","Type":"ContainerStarted","Data":"c41791f419f842e5ba18d7794676dd6b84ec77178a21a05f1926134d09108b82"} Apr 23 01:12:30.124920 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:30.124895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:30.125048 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:30.124937 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:30.125048 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:30.125027 2565 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 01:12:30.125180 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:30.125094 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls podName:16b34105-0f0d-47b8-9a52-6f45a6e31657 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:31.12507961 +0000 UTC m=+138.084933048 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657") : secret "alertmanager-main-tls" not found Apr 23 01:12:30.126206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:30.126184 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:31.008165 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:31.008128 2565 generic.go:358] "Generic (PLEG): container finished" podID="66e0905b-e676-4e53-a97a-bfbf17b1c22d" containerID="3d2f33d073594884beb9b5ef85eb4809b63a0a717c27e04cc48753a99fcc83e2" exitCode=0 Apr 23 01:12:31.008722 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:31.008202 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-svn7q" event={"ID":"66e0905b-e676-4e53-a97a-bfbf17b1c22d","Type":"ContainerDied","Data":"3d2f33d073594884beb9b5ef85eb4809b63a0a717c27e04cc48753a99fcc83e2"} Apr 23 01:12:31.133025 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:31.133000 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:31.135088 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:31.135068 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:31.253853 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:31.253820 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:12:31.370653 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:31.370627 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 01:12:31.372814 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:31.372791 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b34105_0f0d_47b8_9a52_6f45a6e31657.slice/crio-741a860db03caee0572a9fed3b7b2bfcae5677ad7e5bc97452a5b99a61a8120c WatchSource:0}: Error finding container 741a860db03caee0572a9fed3b7b2bfcae5677ad7e5bc97452a5b99a61a8120c: Status 404 returned error can't find the container with id 741a860db03caee0572a9fed3b7b2bfcae5677ad7e5bc97452a5b99a61a8120c Apr 23 01:12:32.013082 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:32.013043 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-svn7q" event={"ID":"66e0905b-e676-4e53-a97a-bfbf17b1c22d","Type":"ContainerStarted","Data":"10bb3aaecc6346a6ac442709a0200a83f61ec7842a60a61936dba9b83c958609"} Apr 23 01:12:32.013517 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:32.013088 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-svn7q" 
event={"ID":"66e0905b-e676-4e53-a97a-bfbf17b1c22d","Type":"ContainerStarted","Data":"7a1b8bbd80a728d92ad9edc6b9c2353a4d8a6cc47f339611df214f56b9d1815b"} Apr 23 01:12:32.014190 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:32.014162 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerStarted","Data":"741a860db03caee0572a9fed3b7b2bfcae5677ad7e5bc97452a5b99a61a8120c"} Apr 23 01:12:32.032700 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:32.032655 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-svn7q" podStartSLOduration=3.232464555 podStartE2EDuration="4.032641798s" podCreationTimestamp="2026-04-23 01:12:28 +0000 UTC" firstStartedPulling="2026-04-23 01:12:29.300650144 +0000 UTC m=+136.260503600" lastFinishedPulling="2026-04-23 01:12:30.1008274 +0000 UTC m=+137.060680843" observedRunningTime="2026-04-23 01:12:32.030687453 +0000 UTC m=+138.990540909" watchObservedRunningTime="2026-04-23 01:12:32.032641798 +0000 UTC m=+138.992495308" Apr 23 01:12:33.018182 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:33.018148 2565 generic.go:358] "Generic (PLEG): container finished" podID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerID="a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b" exitCode=0 Apr 23 01:12:33.018570 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:33.018240 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b"} Apr 23 01:12:35.025937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.025867 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerStarted","Data":"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"} Apr 23 01:12:35.025937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.025903 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerStarted","Data":"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"} Apr 23 01:12:35.025937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.025912 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerStarted","Data":"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"} Apr 23 01:12:35.025937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.025922 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerStarted","Data":"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"} Apr 23 01:12:35.025937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.025932 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerStarted","Data":"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"} Apr 23 01:12:35.786349 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.786322 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dbc6b7d67-srzrg"] Apr 23 01:12:35.789284 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.789268 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.791898 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.791872 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 01:12:35.792054 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.792037 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 01:12:35.792116 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.792071 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 01:12:35.793269 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.793251 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 01:12:35.793327 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.793307 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 01:12:35.793567 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.793540 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cm99s\"" Apr 23 01:12:35.793627 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.793575 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 01:12:35.793793 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.793778 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 01:12:35.798476 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.798448 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 01:12:35.800920 ip-10-0-135-74 kubenswrapper[2565]: 
I0423 01:12:35.800891 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dbc6b7d67-srzrg"] Apr 23 01:12:35.869338 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.869319 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-config\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.869440 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.869343 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-oauth-serving-cert\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.869440 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.869368 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-service-ca\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.869525 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.869444 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj4dn\" (UniqueName: \"kubernetes.io/projected/6a904bd4-6761-4b64-bf27-0064d6382f1a-kube-api-access-hj4dn\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.869525 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.869472 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-oauth-config\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.869525 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.869490 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-trusted-ca-bundle\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.869525 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.869521 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-serving-cert\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.970300 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.970282 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-serving-cert\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.970410 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.970306 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-config\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.970410 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:12:35.970324 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-oauth-serving-cert\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.970410 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.970349 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-service-ca\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.970410 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.970392 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj4dn\" (UniqueName: \"kubernetes.io/projected/6a904bd4-6761-4b64-bf27-0064d6382f1a-kube-api-access-hj4dn\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.970570 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.970419 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-oauth-config\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.970570 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.970435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-trusted-ca-bundle\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " 
pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.971117 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.971093 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-service-ca\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.971117 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.971109 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-config\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.971356 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.971334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-trusted-ca-bundle\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.971546 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.971528 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-oauth-serving-cert\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg" Apr 23 01:12:35.972849 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.972825 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-serving-cert\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " 
pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:12:35.972952 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.972934 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-oauth-config\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:12:35.979256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:35.979236 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj4dn\" (UniqueName: \"kubernetes.io/projected/6a904bd4-6761-4b64-bf27-0064d6382f1a-kube-api-access-hj4dn\") pod \"console-dbc6b7d67-srzrg\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") " pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:12:36.030373 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:36.030349 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerStarted","Data":"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"}
Apr 23 01:12:36.058372 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:36.058290 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.0496011 podStartE2EDuration="7.0582765s" podCreationTimestamp="2026-04-23 01:12:29 +0000 UTC" firstStartedPulling="2026-04-23 01:12:31.374640801 +0000 UTC m=+138.334494239" lastFinishedPulling="2026-04-23 01:12:35.383316194 +0000 UTC m=+142.343169639" observedRunningTime="2026-04-23 01:12:36.056119689 +0000 UTC m=+143.015973149" watchObservedRunningTime="2026-04-23 01:12:36.0582765 +0000 UTC m=+143.018129960"
Apr 23 01:12:36.100514 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:36.100497 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:12:36.209819 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:36.209764 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dbc6b7d67-srzrg"]
Apr 23 01:12:36.212296 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:36.212269 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a904bd4_6761_4b64_bf27_0064d6382f1a.slice/crio-de7532fc08ddac550418a39997db6c97285e6a40e4f456b5abfa3b69d4c16d6f WatchSource:0}: Error finding container de7532fc08ddac550418a39997db6c97285e6a40e4f456b5abfa3b69d4c16d6f: Status 404 returned error can't find the container with id de7532fc08ddac550418a39997db6c97285e6a40e4f456b5abfa3b69d4c16d6f
Apr 23 01:12:37.034405 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:37.034361 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dbc6b7d67-srzrg" event={"ID":"6a904bd4-6761-4b64-bf27-0064d6382f1a","Type":"ContainerStarted","Data":"de7532fc08ddac550418a39997db6c97285e6a40e4f456b5abfa3b69d4c16d6f"}
Apr 23 01:12:39.041144 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:39.041105 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dbc6b7d67-srzrg" event={"ID":"6a904bd4-6761-4b64-bf27-0064d6382f1a","Type":"ContainerStarted","Data":"3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594"}
Apr 23 01:12:39.058464 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:39.058379 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dbc6b7d67-srzrg" podStartSLOduration=1.469425231 podStartE2EDuration="4.05836576s" podCreationTimestamp="2026-04-23 01:12:35 +0000 UTC" firstStartedPulling="2026-04-23 01:12:36.214002589 +0000 UTC m=+143.173856027" lastFinishedPulling="2026-04-23 01:12:38.802943103 +0000 UTC m=+145.762796556" observedRunningTime="2026-04-23 01:12:39.056668416 +0000 UTC m=+146.016521877" watchObservedRunningTime="2026-04-23 01:12:39.05836576 +0000 UTC m=+146.018219220"
Apr 23 01:12:40.826604 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:40.826551 2565 patch_prober.go:28] interesting pod/image-registry-86f775b4f-hdvw2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 01:12:40.826961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:40.826623 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podUID="73635124-a810-416a-8482-00ba10f2ad6e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 01:12:42.985993 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:42.985949 2565 patch_prober.go:28] interesting pod/image-registry-86f775b4f-hdvw2 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 01:12:42.986347 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:42.986007 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podUID="73635124-a810-416a-8482-00ba10f2ad6e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 01:12:46.101331 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:46.101300 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:12:46.101331 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:46.101337 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:12:46.102514 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:46.102495 2565 patch_prober.go:28] interesting pod/console-dbc6b7d67-srzrg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.15:8443/health\": dial tcp 10.134.0.15:8443: connect: connection refused" start-of-body=
Apr 23 01:12:46.102571 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:46.102535 2565 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-dbc6b7d67-srzrg" podUID="6a904bd4-6761-4b64-bf27-0064d6382f1a" containerName="console" probeResult="failure" output="Get \"https://10.134.0.15:8443/health\": dial tcp 10.134.0.15:8443: connect: connection refused"
Apr 23 01:12:48.428907 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:48.428867 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-czwb8" podUID="2efc1102-e677-4cde-b6a8-5304536665ad"
Apr 23 01:12:48.451140 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:12:48.451111 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p8psz" podUID="9342de38-1c25-496b-b531-420cff35d1e6"
Apr 23 01:12:49.066300 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:49.066271 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-czwb8"
Apr 23 01:12:50.826814 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:50.826778 2565 patch_prober.go:28] interesting pod/image-registry-86f775b4f-hdvw2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 01:12:50.827180 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:50.826824 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podUID="73635124-a810-416a-8482-00ba10f2ad6e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 01:12:52.985603 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:52.985570 2565 patch_prober.go:28] interesting pod/image-registry-86f775b4f-hdvw2 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 01:12:52.985959 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:52.985618 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podUID="73635124-a810-416a-8482-00ba10f2ad6e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 01:12:53.397314 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:53.397288 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8"
Apr 23 01:12:53.397451 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:53.397347 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz"
Apr 23 01:12:53.399654 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:53.399623 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2efc1102-e677-4cde-b6a8-5304536665ad-metrics-tls\") pod \"dns-default-czwb8\" (UID: \"2efc1102-e677-4cde-b6a8-5304536665ad\") " pod="openshift-dns/dns-default-czwb8"
Apr 23 01:12:53.399758 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:53.399674 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9342de38-1c25-496b-b531-420cff35d1e6-cert\") pod \"ingress-canary-p8psz\" (UID: \"9342de38-1c25-496b-b531-420cff35d1e6\") " pod="openshift-ingress-canary/ingress-canary-p8psz"
Apr 23 01:12:53.569843 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:53.569813 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4q4b\""
Apr 23 01:12:53.577894 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:53.577874 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-czwb8"
Apr 23 01:12:53.686160 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:53.686093 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-czwb8"]
Apr 23 01:12:53.688748 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:12:53.688706 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2efc1102_e677_4cde_b6a8_5304536665ad.slice/crio-637ee9da082f3fc8bf53a2d53d5fa4937ee54386ae1d8344607186f85d3489f5 WatchSource:0}: Error finding container 637ee9da082f3fc8bf53a2d53d5fa4937ee54386ae1d8344607186f85d3489f5: Status 404 returned error can't find the container with id 637ee9da082f3fc8bf53a2d53d5fa4937ee54386ae1d8344607186f85d3489f5
Apr 23 01:12:54.079830 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:54.079762 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-czwb8" event={"ID":"2efc1102-e677-4cde-b6a8-5304536665ad","Type":"ContainerStarted","Data":"637ee9da082f3fc8bf53a2d53d5fa4937ee54386ae1d8344607186f85d3489f5"}
Apr 23 01:12:55.084286 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:55.084254 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-czwb8" event={"ID":"2efc1102-e677-4cde-b6a8-5304536665ad","Type":"ContainerStarted","Data":"ba7800e4670e945d7a72de9d03e39a9ffca10268c21f1d85eadb3bf119b7b91f"}
Apr 23 01:12:56.088691 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:56.088653 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-czwb8" event={"ID":"2efc1102-e677-4cde-b6a8-5304536665ad","Type":"ContainerStarted","Data":"be06548a0b0002582b68d2e852ffd983a421248dae4c833e6ffb20fdbcdae72e"}
Apr 23 01:12:56.089049 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:56.088770 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-czwb8"
Apr 23 01:12:56.101557 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:56.101535 2565 patch_prober.go:28] interesting pod/console-dbc6b7d67-srzrg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.15:8443/health\": dial tcp 10.134.0.15:8443: connect: connection refused" start-of-body=
Apr 23 01:12:56.101690 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:56.101573 2565 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-dbc6b7d67-srzrg" podUID="6a904bd4-6761-4b64-bf27-0064d6382f1a" containerName="console" probeResult="failure" output="Get \"https://10.134.0.15:8443/health\": dial tcp 10.134.0.15:8443: connect: connection refused"
Apr 23 01:12:56.105365 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:12:56.105329 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-czwb8" podStartSLOduration=129.891194923 podStartE2EDuration="2m11.105317473s" podCreationTimestamp="2026-04-23 01:10:45 +0000 UTC" firstStartedPulling="2026-04-23 01:12:53.690594916 +0000 UTC m=+160.650448354" lastFinishedPulling="2026-04-23 01:12:54.904717458 +0000 UTC m=+161.864570904" observedRunningTime="2026-04-23 01:12:56.103731481 +0000 UTC m=+163.063584952" watchObservedRunningTime="2026-04-23 01:12:56.105317473 +0000 UTC m=+163.065170934"
Apr 23 01:13:00.826603 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:00.826568 2565 patch_prober.go:28] interesting pod/image-registry-86f775b4f-hdvw2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 01:13:00.827154 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:00.826621 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podUID="73635124-a810-416a-8482-00ba10f2ad6e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 01:13:00.827154 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:00.826660 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2"
Apr 23 01:13:00.827154 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:00.827122 2565 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"2449844cfba174ef426744a65e6df35e4eb86010ec97ca63d50daf2ca9448ec5"} pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 23 01:13:00.830390 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:00.830363 2565 patch_prober.go:28] interesting pod/image-registry-86f775b4f-hdvw2 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 01:13:00.830508 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:00.830408 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podUID="73635124-a810-416a-8482-00ba10f2ad6e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 01:13:03.557433 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:03.557368 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8psz"
Apr 23 01:13:03.560340 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:03.560322 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jjv8h\""
Apr 23 01:13:03.568390 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:03.568375 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p8psz"
Apr 23 01:13:03.678072 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:03.678047 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p8psz"]
Apr 23 01:13:03.681078 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:13:03.681053 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9342de38_1c25_496b_b531_420cff35d1e6.slice/crio-0618e84490f522b33eb44515221f1a1a64cb8b30aa6ebcd85aa9fb79b4393e91 WatchSource:0}: Error finding container 0618e84490f522b33eb44515221f1a1a64cb8b30aa6ebcd85aa9fb79b4393e91: Status 404 returned error can't find the container with id 0618e84490f522b33eb44515221f1a1a64cb8b30aa6ebcd85aa9fb79b4393e91
Apr 23 01:13:03.849057 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:03.848996 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dbc6b7d67-srzrg"]
Apr 23 01:13:04.110532 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:04.110457 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p8psz" event={"ID":"9342de38-1c25-496b-b531-420cff35d1e6","Type":"ContainerStarted","Data":"0618e84490f522b33eb44515221f1a1a64cb8b30aa6ebcd85aa9fb79b4393e91"}
Apr 23 01:13:06.093161 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:06.093127 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-czwb8"
Apr 23 01:13:06.121914 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:06.121882 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p8psz" event={"ID":"9342de38-1c25-496b-b531-420cff35d1e6","Type":"ContainerStarted","Data":"bef4fe7b04a87c21b9a08b720d5e3a2a583c18ea0f78ed4c564712ba5d6b25b8"}
Apr 23 01:13:06.137624 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:06.137579 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p8psz" podStartSLOduration=139.723877678 podStartE2EDuration="2m21.137566652s" podCreationTimestamp="2026-04-23 01:10:45 +0000 UTC" firstStartedPulling="2026-04-23 01:13:03.68341014 +0000 UTC m=+170.643263577" lastFinishedPulling="2026-04-23 01:13:05.097099114 +0000 UTC m=+172.056952551" observedRunningTime="2026-04-23 01:13:06.136407569 +0000 UTC m=+173.096261028" watchObservedRunningTime="2026-04-23 01:13:06.137566652 +0000 UTC m=+173.097420112"
Apr 23 01:13:10.830787 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:10.830761 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2"
Apr 23 01:13:16.150645 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:16.150614 2565 generic.go:358] "Generic (PLEG): container finished" podID="8cfda3ef-b3c6-4cb0-8e04-57fefbef4404" containerID="e25ff28339a9f727224c90b70b40dae141fddef7afb073f32b0ab272806b2f98" exitCode=0
Apr 23 01:13:16.151046 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:16.150658 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5" event={"ID":"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404","Type":"ContainerDied","Data":"e25ff28339a9f727224c90b70b40dae141fddef7afb073f32b0ab272806b2f98"}
Apr 23 01:13:16.151046 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:16.150964 2565 scope.go:117] "RemoveContainer" containerID="e25ff28339a9f727224c90b70b40dae141fddef7afb073f32b0ab272806b2f98"
Apr 23 01:13:17.155110 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:17.155065 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b9dl5" event={"ID":"8cfda3ef-b3c6-4cb0-8e04-57fefbef4404","Type":"ContainerStarted","Data":"1fa45194bdec4f1bc7fadf2e4a9bd9f0a126ddf499fc039831fa9ce198d770d0"}
Apr 23 01:13:25.845355 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:25.845309 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" podUID="73635124-a810-416a-8482-00ba10f2ad6e" containerName="registry" containerID="cri-o://2449844cfba174ef426744a65e6df35e4eb86010ec97ca63d50daf2ca9448ec5" gracePeriod=30
Apr 23 01:13:27.183728 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:27.183693 2565 generic.go:358] "Generic (PLEG): container finished" podID="73635124-a810-416a-8482-00ba10f2ad6e" containerID="2449844cfba174ef426744a65e6df35e4eb86010ec97ca63d50daf2ca9448ec5" exitCode=0
Apr 23 01:13:27.184112 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:27.183773 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" event={"ID":"73635124-a810-416a-8482-00ba10f2ad6e","Type":"ContainerDied","Data":"2449844cfba174ef426744a65e6df35e4eb86010ec97ca63d50daf2ca9448ec5"}
Apr 23 01:13:27.184112 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:27.183812 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" event={"ID":"73635124-a810-416a-8482-00ba10f2ad6e","Type":"ContainerStarted","Data":"d738fb97f6e88f14b212391268b3a076e8836dd5de4f8ead7f92636ec9c663ff"}
Apr 23 01:13:27.184112 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:27.183859 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2"
Apr 23 01:13:28.867317 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:28.867262 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dbc6b7d67-srzrg" podUID="6a904bd4-6761-4b64-bf27-0064d6382f1a" containerName="console" containerID="cri-o://3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594" gracePeriod=15
Apr 23 01:13:29.098287 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.098266 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dbc6b7d67-srzrg_6a904bd4-6761-4b64-bf27-0064d6382f1a/console/0.log"
Apr 23 01:13:29.098386 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.098336 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:13:29.191127 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.191104 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dbc6b7d67-srzrg_6a904bd4-6761-4b64-bf27-0064d6382f1a/console/0.log"
Apr 23 01:13:29.191263 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.191142 2565 generic.go:358] "Generic (PLEG): container finished" podID="6a904bd4-6761-4b64-bf27-0064d6382f1a" containerID="3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594" exitCode=2
Apr 23 01:13:29.191263 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.191195 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dbc6b7d67-srzrg" event={"ID":"6a904bd4-6761-4b64-bf27-0064d6382f1a","Type":"ContainerDied","Data":"3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594"}
Apr 23 01:13:29.191263 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.191215 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dbc6b7d67-srzrg" event={"ID":"6a904bd4-6761-4b64-bf27-0064d6382f1a","Type":"ContainerDied","Data":"de7532fc08ddac550418a39997db6c97285e6a40e4f456b5abfa3b69d4c16d6f"}
Apr 23 01:13:29.191263 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.191229 2565 scope.go:117] "RemoveContainer" containerID="3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594"
Apr 23 01:13:29.191263 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.191231 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dbc6b7d67-srzrg"
Apr 23 01:13:29.200459 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.200381 2565 scope.go:117] "RemoveContainer" containerID="3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594"
Apr 23 01:13:29.201160 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:29.201134 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594\": container with ID starting with 3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594 not found: ID does not exist" containerID="3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594"
Apr 23 01:13:29.201243 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.201167 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594"} err="failed to get container status \"3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594\": rpc error: code = NotFound desc = could not find container \"3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594\": container with ID starting with 3b78bd6c0c1421d732e376a6c7c9c9814c46ba4695e92305c5c3af08ca956594 not found: ID does not exist"
Apr 23 01:13:29.249412 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249391 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-oauth-config\") pod \"6a904bd4-6761-4b64-bf27-0064d6382f1a\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") "
Apr 23 01:13:29.249500 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249438 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj4dn\" (UniqueName: \"kubernetes.io/projected/6a904bd4-6761-4b64-bf27-0064d6382f1a-kube-api-access-hj4dn\") pod \"6a904bd4-6761-4b64-bf27-0064d6382f1a\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") "
Apr 23 01:13:29.249500 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249455 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-serving-cert\") pod \"6a904bd4-6761-4b64-bf27-0064d6382f1a\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") "
Apr 23 01:13:29.249500 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249477 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-oauth-serving-cert\") pod \"6a904bd4-6761-4b64-bf27-0064d6382f1a\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") "
Apr 23 01:13:29.249500 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249500 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-config\") pod \"6a904bd4-6761-4b64-bf27-0064d6382f1a\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") "
Apr 23 01:13:29.249703 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249527 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-service-ca\") pod \"6a904bd4-6761-4b64-bf27-0064d6382f1a\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") "
Apr 23 01:13:29.249703 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249558 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-trusted-ca-bundle\") pod \"6a904bd4-6761-4b64-bf27-0064d6382f1a\" (UID: \"6a904bd4-6761-4b64-bf27-0064d6382f1a\") "
Apr 23 01:13:29.250010 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249932 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a904bd4-6761-4b64-bf27-0064d6382f1a" (UID: "6a904bd4-6761-4b64-bf27-0064d6382f1a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:13:29.250010 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249942 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-config" (OuterVolumeSpecName: "console-config") pod "6a904bd4-6761-4b64-bf27-0064d6382f1a" (UID: "6a904bd4-6761-4b64-bf27-0064d6382f1a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:13:29.250010 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.249962 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a904bd4-6761-4b64-bf27-0064d6382f1a" (UID: "6a904bd4-6761-4b64-bf27-0064d6382f1a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:13:29.250172 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.250047 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a904bd4-6761-4b64-bf27-0064d6382f1a" (UID: "6a904bd4-6761-4b64-bf27-0064d6382f1a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 01:13:29.251514 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.251494 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a904bd4-6761-4b64-bf27-0064d6382f1a" (UID: "6a904bd4-6761-4b64-bf27-0064d6382f1a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:13:29.251860 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.251843 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a904bd4-6761-4b64-bf27-0064d6382f1a-kube-api-access-hj4dn" (OuterVolumeSpecName: "kube-api-access-hj4dn") pod "6a904bd4-6761-4b64-bf27-0064d6382f1a" (UID: "6a904bd4-6761-4b64-bf27-0064d6382f1a"). InnerVolumeSpecName "kube-api-access-hj4dn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:13:29.251860 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.251845 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a904bd4-6761-4b64-bf27-0064d6382f1a" (UID: "6a904bd4-6761-4b64-bf27-0064d6382f1a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:13:29.350427 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.350409 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-oauth-config\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:29.350427 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.350427 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hj4dn\" (UniqueName: \"kubernetes.io/projected/6a904bd4-6761-4b64-bf27-0064d6382f1a-kube-api-access-hj4dn\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:29.350532 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.350437 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-serving-cert\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:29.350532 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.350447 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-oauth-serving-cert\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:29.350532 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.350455 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-console-config\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:29.350532 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.350463 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-service-ca\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:29.350532 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.350471 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a904bd4-6761-4b64-bf27-0064d6382f1a-trusted-ca-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:29.513705 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.513672 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dbc6b7d67-srzrg"]
Apr 23 01:13:29.516671 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.516652 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dbc6b7d67-srzrg"]
Apr 23 01:13:29.559169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:29.559139 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a904bd4-6761-4b64-bf27-0064d6382f1a" path="/var/lib/kubelet/pods/6a904bd4-6761-4b64-bf27-0064d6382f1a/volumes"
Apr 23 01:13:48.038531 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.038499 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fd49b8965-hclkn"]
Apr 23 01:13:48.039061 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.038763 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a904bd4-6761-4b64-bf27-0064d6382f1a" containerName="console"
Apr 23 01:13:48.039061 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.038775 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a904bd4-6761-4b64-bf27-0064d6382f1a" containerName="console"
Apr 23 01:13:48.039061 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.038822 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a904bd4-6761-4b64-bf27-0064d6382f1a" containerName="console"
Apr 23 01:13:48.044562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.044544 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fd49b8965-hclkn"
Apr 23 01:13:48.047378 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.047353 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 01:13:48.048735 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.048713 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 01:13:48.048735 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.048732 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 01:13:48.048951 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.048767 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 01:13:48.048951 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.048735 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cm99s\""
Apr 23 01:13:48.049084 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.049049 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 01:13:48.049143 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.049114 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 01:13:48.050698 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.050649 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 01:13:48.052375 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.052356 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd49b8965-hclkn"]
Apr 23 01:13:48.054738 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.054719 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 01:13:48.180126 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.180099 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-oauth-serving-cert\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn"
Apr 23 01:13:48.180244 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.180130 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-trusted-ca-bundle\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn"
Apr 23 01:13:48.180244 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.180149 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-oauth-config\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn"
Apr 23 01:13:48.180244 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.180175 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-console-config\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn"
Apr 23 01:13:48.180408 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.180257 2565 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdhz\" (UniqueName: \"kubernetes.io/projected/b5faffb9-3444-4db1-869e-e329c8f61648-kube-api-access-fpdhz\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.180408 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.180300 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-serving-cert\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.180408 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.180329 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-service-ca\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.191369 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.191348 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86f775b4f-hdvw2" Apr 23 01:13:48.281033 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.280952 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-service-ca\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281182 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281042 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-oauth-serving-cert\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281182 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281074 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-trusted-ca-bundle\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281182 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281101 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-oauth-config\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281182 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-console-config\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281182 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281177 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdhz\" (UniqueName: \"kubernetes.io/projected/b5faffb9-3444-4db1-869e-e329c8f61648-kube-api-access-fpdhz\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281444 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281214 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-serving-cert\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281774 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281749 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-service-ca\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281871 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281802 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-oauth-serving-cert\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281923 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281883 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-console-config\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.281994 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.281957 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-trusted-ca-bundle\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.283673 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.283651 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-serving-cert\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.283755 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.283669 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-oauth-config\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.289761 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.289712 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdhz\" (UniqueName: \"kubernetes.io/projected/b5faffb9-3444-4db1-869e-e329c8f61648-kube-api-access-fpdhz\") pod \"console-5fd49b8965-hclkn\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.354868 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.354846 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:48.465055 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.465032 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd49b8965-hclkn"] Apr 23 01:13:48.466898 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:13:48.466867 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5faffb9_3444_4db1_869e_e329c8f61648.slice/crio-5feeb43580e6e566561048fb5d6d25d5ff8f7fec120b946f4c0f902513ea6ddc WatchSource:0}: Error finding container 5feeb43580e6e566561048fb5d6d25d5ff8f7fec120b946f4c0f902513ea6ddc: Status 404 returned error can't find the container with id 5feeb43580e6e566561048fb5d6d25d5ff8f7fec120b946f4c0f902513ea6ddc Apr 23 01:13:48.637213 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.637126 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 01:13:48.637575 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.637550 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="alertmanager" containerID="cri-o://171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3" gracePeriod=120 Apr 23 01:13:48.637650 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.637608 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-metric" containerID="cri-o://f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2" gracePeriod=120 Apr 23 01:13:48.637710 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.637689 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="config-reloader" containerID="cri-o://3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070" gracePeriod=120 Apr 23 01:13:48.637758 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.637677 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy" containerID="cri-o://c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f" gracePeriod=120 Apr 23 01:13:48.637758 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.637684 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="prom-label-proxy" containerID="cri-o://f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7" gracePeriod=120 Apr 23 01:13:48.637852 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:48.637690 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-web" containerID="cri-o://e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5" gracePeriod=120 Apr 23 01:13:49.245886 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.245853 2565 generic.go:358] "Generic (PLEG): container finished" podID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerID="f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7" exitCode=0 Apr 23 01:13:49.245886 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.245880 2565 generic.go:358] "Generic (PLEG): container finished" podID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerID="c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f" exitCode=0 Apr 23 01:13:49.245886 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.245887 2565 generic.go:358] "Generic 
(PLEG): container finished" podID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerID="3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070" exitCode=0 Apr 23 01:13:49.245886 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.245894 2565 generic.go:358] "Generic (PLEG): container finished" podID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerID="171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3" exitCode=0 Apr 23 01:13:49.246448 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.245925 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"} Apr 23 01:13:49.246448 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.245961 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"} Apr 23 01:13:49.246448 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.245990 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"} Apr 23 01:13:49.246448 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.246004 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"} Apr 23 01:13:49.247302 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.247282 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd49b8965-hclkn" 
event={"ID":"b5faffb9-3444-4db1-869e-e329c8f61648","Type":"ContainerStarted","Data":"3d12a0fbffa36fec0194641e6d73c9ee559a64e8c1d49217bba78211f602cbd6"} Apr 23 01:13:49.247420 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.247306 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd49b8965-hclkn" event={"ID":"b5faffb9-3444-4db1-869e-e329c8f61648","Type":"ContainerStarted","Data":"5feeb43580e6e566561048fb5d6d25d5ff8f7fec120b946f4c0f902513ea6ddc"} Apr 23 01:13:49.263649 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.263609 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fd49b8965-hclkn" podStartSLOduration=1.263593165 podStartE2EDuration="1.263593165s" podCreationTimestamp="2026-04-23 01:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:13:49.262510596 +0000 UTC m=+216.222364047" watchObservedRunningTime="2026-04-23 01:13:49.263593165 +0000 UTC m=+216.223446626" Apr 23 01:13:49.874482 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.874462 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:49.991330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991254 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-main-db\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991298 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-tls-assets\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991315 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-volume\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991578 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991347 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-out\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991578 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991380 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-cluster-tls-config\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991578 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:13:49.991408 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-metric\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991578 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991438 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t5wk\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-kube-api-access-8t5wk\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991771 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991629 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:13:49.991834 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991789 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-web-config\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991889 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991837 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991939 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991888 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-metrics-client-ca\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.991939 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991933 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-web\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.992068 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.991962 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: 
\"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.992068 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.992028 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle\") pod \"16b34105-0f0d-47b8-9a52-6f45a6e31657\" (UID: \"16b34105-0f0d-47b8-9a52-6f45a6e31657\") " Apr 23 01:13:49.992891 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.992265 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-main-db\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:13:49.992891 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.992656 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:49.993122 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.993095 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:49.994106 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.994055 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-volume" (OuterVolumeSpecName: "config-volume") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:49.994431 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.994405 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-out" (OuterVolumeSpecName: "config-out") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:13:49.994525 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.994495 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:49.994525 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.994499 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-kube-api-access-8t5wk" (OuterVolumeSpecName: "kube-api-access-8t5wk") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "kube-api-access-8t5wk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:49.995302 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.995279 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:49.995573 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.995556 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:49.995660 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.995587 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:13:49.995907 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.995891 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:13:49.998339 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:49.998315 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:13:50.003577 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.003558 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-web-config" (OuterVolumeSpecName: "web-config") pod "16b34105-0f0d-47b8-9a52-6f45a6e31657" (UID: "16b34105-0f0d-47b8-9a52-6f45a6e31657"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 01:13:50.093540 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093511 2565 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-metrics-client-ca\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093540 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093541 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093551 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-main-tls\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093560 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16b34105-0f0d-47b8-9a52-6f45a6e31657-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093569 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-tls-assets\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093578 2565 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-volume\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093585 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16b34105-0f0d-47b8-9a52-6f45a6e31657-config-out\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093594 2565 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-cluster-tls-config\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093620 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093630 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8t5wk\" (UniqueName: \"kubernetes.io/projected/16b34105-0f0d-47b8-9a52-6f45a6e31657-kube-api-access-8t5wk\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093639 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-web-config\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.093656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.093648 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16b34105-0f0d-47b8-9a52-6f45a6e31657-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:13:50.253625 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.253561 2565 generic.go:358] "Generic (PLEG): container finished" podID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerID="f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2" exitCode=0
Apr 23 01:13:50.253625 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.253581 2565 generic.go:358] "Generic (PLEG): container finished" podID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerID="e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5" exitCode=0
Apr 23 01:13:50.254009 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.253637 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"}
Apr 23 01:13:50.254009 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.253674 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.254009 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.253688 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"}
Apr 23 01:13:50.254009 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.253703 2565 scope.go:117] "RemoveContainer" containerID="f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"
Apr 23 01:13:50.254009 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.253706 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16b34105-0f0d-47b8-9a52-6f45a6e31657","Type":"ContainerDied","Data":"741a860db03caee0572a9fed3b7b2bfcae5677ad7e5bc97452a5b99a61a8120c"}
Apr 23 01:13:50.261057 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.261034 2565 scope.go:117] "RemoveContainer" containerID="f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"
Apr 23 01:13:50.267470 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.267451 2565 scope.go:117] "RemoveContainer" containerID="c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"
Apr 23 01:13:50.277464 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.277441 2565 scope.go:117] "RemoveContainer" containerID="e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"
Apr 23 01:13:50.283412 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.283392 2565 scope.go:117] "RemoveContainer" containerID="3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"
Apr 23 01:13:50.293359 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.293334 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 01:13:50.297103 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.297084 2565 scope.go:117] "RemoveContainer" containerID="171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"
Apr 23 01:13:50.303244 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.303224 2565 scope.go:117] "RemoveContainer" containerID="a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b"
Apr 23 01:13:50.306817 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.306798 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 01:13:50.309597 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.309582 2565 scope.go:117] "RemoveContainer" containerID="f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"
Apr 23 01:13:50.309835 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:50.309818 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7\": container with ID starting with f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7 not found: ID does not exist" containerID="f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"
Apr 23 01:13:50.309883 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.309841 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"} err="failed to get container status \"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7\": rpc error: code = NotFound desc = could not find container \"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7\": container with ID starting with f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7 not found: ID does not exist"
Apr 23 01:13:50.309883 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.309858 2565 scope.go:117] "RemoveContainer" containerID="f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"
Apr 23 01:13:50.310136 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:50.310120 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2\": container with ID starting with f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2 not found: ID does not exist" containerID="f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"
Apr 23 01:13:50.310180 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310139 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"} err="failed to get container status \"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2\": rpc error: code = NotFound desc = could not find container \"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2\": container with ID starting with f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2 not found: ID does not exist"
Apr 23 01:13:50.310180 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310152 2565 scope.go:117] "RemoveContainer" containerID="c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"
Apr 23 01:13:50.310340 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:50.310324 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f\": container with ID starting with c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f not found: ID does not exist" containerID="c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"
Apr 23 01:13:50.310387 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310343 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"} err="failed to get container status \"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f\": rpc error: code = NotFound desc = could not find container \"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f\": container with ID starting with c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f not found: ID does not exist"
Apr 23 01:13:50.310387 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310356 2565 scope.go:117] "RemoveContainer" containerID="e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"
Apr 23 01:13:50.310551 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:50.310536 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5\": container with ID starting with e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5 not found: ID does not exist" containerID="e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"
Apr 23 01:13:50.310588 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310555 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"} err="failed to get container status \"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5\": rpc error: code = NotFound desc = could not find container \"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5\": container with ID starting with e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5 not found: ID does not exist"
Apr 23 01:13:50.310588 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310567 2565 scope.go:117] "RemoveContainer" containerID="3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"
Apr 23 01:13:50.310741 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:50.310726 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070\": container with ID starting with 3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070 not found: ID does not exist" containerID="3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"
Apr 23 01:13:50.310779 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310743 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"} err="failed to get container status \"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070\": rpc error: code = NotFound desc = could not find container \"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070\": container with ID starting with 3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070 not found: ID does not exist"
Apr 23 01:13:50.310779 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310754 2565 scope.go:117] "RemoveContainer" containerID="171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"
Apr 23 01:13:50.310926 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:50.310911 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3\": container with ID starting with 171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3 not found: ID does not exist" containerID="171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"
Apr 23 01:13:50.310964 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310929 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"} err="failed to get container status \"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3\": rpc error: code = NotFound desc = could not find container \"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3\": container with ID starting with 171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3 not found: ID does not exist"
Apr 23 01:13:50.310964 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.310940 2565 scope.go:117] "RemoveContainer" containerID="a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b"
Apr 23 01:13:50.311206 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:13:50.311190 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b\": container with ID starting with a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b not found: ID does not exist" containerID="a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b"
Apr 23 01:13:50.311246 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311212 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b"} err="failed to get container status \"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b\": rpc error: code = NotFound desc = could not find container \"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b\": container with ID starting with a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b not found: ID does not exist"
Apr 23 01:13:50.311246 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311227 2565 scope.go:117] "RemoveContainer" containerID="f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"
Apr 23 01:13:50.311432 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311416 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7"} err="failed to get container status \"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7\": rpc error: code = NotFound desc = could not find container \"f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7\": container with ID starting with f37d92d5e4f72ce305418f455149f8c7670ac387053857c613db89f3fe6dc0f7 not found: ID does not exist"
Apr 23 01:13:50.311432 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311431 2565 scope.go:117] "RemoveContainer" containerID="f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"
Apr 23 01:13:50.311586 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311569 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2"} err="failed to get container status \"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2\": rpc error: code = NotFound desc = could not find container \"f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2\": container with ID starting with f577f6fe70e7c4744ed709a223ceb9eeec5d9748450179c589849cdacf514dd2 not found: ID does not exist"
Apr 23 01:13:50.311625 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311588 2565 scope.go:117] "RemoveContainer" containerID="c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"
Apr 23 01:13:50.311778 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311763 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f"} err="failed to get container status \"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f\": rpc error: code = NotFound desc = could not find container \"c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f\": container with ID starting with c7687192c01e4cbf339cfb011f44f7c581dd2757b597aacb2a2bfc4b9b84b22f not found: ID does not exist"
Apr 23 01:13:50.311778 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311777 2565 scope.go:117] "RemoveContainer" containerID="e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"
Apr 23 01:13:50.311995 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311959 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5"} err="failed to get container status \"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5\": rpc error: code = NotFound desc = could not find container \"e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5\": container with ID starting with e505947495b29267eba3ae65fa9880e8434684bebdd7af859216459e89d99af5 not found: ID does not exist"
Apr 23 01:13:50.312060 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.311971 2565 scope.go:117] "RemoveContainer" containerID="3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"
Apr 23 01:13:50.312225 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.312208 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070"} err="failed to get container status \"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070\": rpc error: code = NotFound desc = could not find container \"3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070\": container with ID starting with 3b14cd6f7c9a41de0141fdaea798dac8363a3c8bfbd7e483871325b34b227070 not found: ID does not exist"
Apr 23 01:13:50.312225 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.312224 2565 scope.go:117] "RemoveContainer" containerID="171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"
Apr 23 01:13:50.312445 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.312415 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3"} err="failed to get container status \"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3\": rpc error: code = NotFound desc = could not find container \"171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3\": container with ID starting with 171ce3322ffa836372c1cbe9872b3c940657482f9be4a20553bdedfda9e0def3 not found: ID does not exist"
Apr 23 01:13:50.312445 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.312439 2565 scope.go:117] "RemoveContainer" containerID="a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b"
Apr 23 01:13:50.312645 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.312631 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b"} err="failed to get container status \"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b\": rpc error: code = NotFound desc = could not find container \"a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b\": container with ID starting with a98c23f241a77ce25079b7f32a3b4ae1b94f928e3e1a0da72bb94665bb4ddb4b not found: ID does not exist"
Apr 23 01:13:50.338488 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338470 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 01:13:50.338816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338759 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-web"
Apr 23 01:13:50.338816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338778 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-web"
Apr 23 01:13:50.338816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338794 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-metric"
Apr 23 01:13:50.338816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338801 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-metric"
Apr 23 01:13:50.338816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338810 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="prom-label-proxy"
Apr 23 01:13:50.338816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338819 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="prom-label-proxy"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338830 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="alertmanager"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338837 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="alertmanager"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338853 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="init-config-reloader"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338861 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="init-config-reloader"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338869 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="config-reloader"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338876 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="config-reloader"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338885 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338893 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338946 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="alertmanager"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338957 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="config-reloader"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.338965 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="prom-label-proxy"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.339001 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-web"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.339013 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy-metric"
Apr 23 01:13:50.339584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.339021 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" containerName="kube-rbac-proxy"
Apr 23 01:13:50.343908 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.343894 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.346618 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.346599 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 01:13:50.346855 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.346835 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hrrqc\""
Apr 23 01:13:50.346952 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.346857 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 01:13:50.346952 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.346869 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 01:13:50.346952 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.346878 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 01:13:50.347178 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.347163 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 01:13:50.347374 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.347360 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 01:13:50.348254 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.348239 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 01:13:50.348337 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.348269 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 01:13:50.351988 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.351952 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 01:13:50.353557 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.353538 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 01:13:50.496601 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496578 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-config-volume\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496695 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496611 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-web-config\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496695 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496626 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3259f911-f547-46dd-8b53-171fdc52f63a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496695 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496664 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496695 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496689 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3259f911-f547-46dd-8b53-171fdc52f63a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496709 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496758 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3259f911-f547-46dd-8b53-171fdc52f63a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496774 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3259f911-f547-46dd-8b53-171fdc52f63a-config-out\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496816 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496792 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496964 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496871 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496964 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496905 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdbpk\" (UniqueName: \"kubernetes.io/projected/3259f911-f547-46dd-8b53-171fdc52f63a-kube-api-access-vdbpk\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.496964 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.496924 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3259f911-f547-46dd-8b53-171fdc52f63a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.597475 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597428 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-web-config\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.597475 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597453 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3259f911-f547-46dd-8b53-171fdc52f63a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.597579 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597476 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.597579 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597492 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3259f911-f547-46dd-8b53-171fdc52f63a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 01:13:50.597579 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.597684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597627 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.597684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597651 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3259f911-f547-46dd-8b53-171fdc52f63a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.597684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597677 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3259f911-f547-46dd-8b53-171fdc52f63a-config-out\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.598548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597705 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.598548 ip-10-0-135-74 kubenswrapper[2565]: 
I0423 01:13:50.597762 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.598548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597793 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdbpk\" (UniqueName: \"kubernetes.io/projected/3259f911-f547-46dd-8b53-171fdc52f63a-kube-api-access-vdbpk\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.598548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3259f911-f547-46dd-8b53-171fdc52f63a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.598548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.597862 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-config-volume\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.598548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.598243 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3259f911-f547-46dd-8b53-171fdc52f63a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.598548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.598524 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3259f911-f547-46dd-8b53-171fdc52f63a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.600620 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.600516 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.600620 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.600523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3259f911-f547-46dd-8b53-171fdc52f63a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.600774 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.600724 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.600835 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.600792 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.600835 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.600818 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-web-config\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.600947 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.600923 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.601079 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.601060 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3259f911-f547-46dd-8b53-171fdc52f63a-config-out\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.601138 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.601088 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3259f911-f547-46dd-8b53-171fdc52f63a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.601178 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.601132 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.602607 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.602591 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3259f911-f547-46dd-8b53-171fdc52f63a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.606133 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.606115 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdbpk\" (UniqueName: \"kubernetes.io/projected/3259f911-f547-46dd-8b53-171fdc52f63a-kube-api-access-vdbpk\") pod \"alertmanager-main-0\" (UID: \"3259f911-f547-46dd-8b53-171fdc52f63a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.653185 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.653163 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 01:13:50.777299 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:50.777277 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 01:13:50.779181 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:13:50.779155 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3259f911_f547_46dd_8b53_171fdc52f63a.slice/crio-02893cd619dd3fbb3306e761d008477a3fa1def6771446e215e223675456082e WatchSource:0}: Error finding container 02893cd619dd3fbb3306e761d008477a3fa1def6771446e215e223675456082e: Status 404 returned error can't find the container with id 02893cd619dd3fbb3306e761d008477a3fa1def6771446e215e223675456082e Apr 23 01:13:51.257966 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:51.257929 2565 generic.go:358] "Generic (PLEG): container finished" podID="3259f911-f547-46dd-8b53-171fdc52f63a" containerID="1adc843c13fad8fae12dfc849586eba90a218b600d1fd900e3c1d50a6ee525a3" exitCode=0 Apr 23 01:13:51.258311 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:51.258021 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerDied","Data":"1adc843c13fad8fae12dfc849586eba90a218b600d1fd900e3c1d50a6ee525a3"} Apr 23 01:13:51.258311 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:51.258051 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerStarted","Data":"02893cd619dd3fbb3306e761d008477a3fa1def6771446e215e223675456082e"} Apr 23 01:13:51.560224 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:51.560149 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b34105-0f0d-47b8-9a52-6f45a6e31657" 
path="/var/lib/kubelet/pods/16b34105-0f0d-47b8-9a52-6f45a6e31657/volumes" Apr 23 01:13:52.264077 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:52.264045 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerStarted","Data":"4ca586d44f6ec3dc8a5f4e80c41004c43641c0bccf8b07218d26be25a2dc9639"} Apr 23 01:13:52.264416 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:52.264081 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerStarted","Data":"1405f39eaaacbb21257e9340eb9106e0da4d27515f9601403eb60d07c2edff9f"} Apr 23 01:13:52.264416 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:52.264095 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerStarted","Data":"572fa61e98e89b83faff279e57661f8749f14959fd7d7045ae9193dcbd20fe4b"} Apr 23 01:13:52.264416 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:52.264105 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerStarted","Data":"9cc1bca37c680dc4fd641f651e7a0fd54ec98e3fc25c97212cc46d8aa177856d"} Apr 23 01:13:52.264416 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:52.264116 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerStarted","Data":"6471c67fd843195a68f33e13003e22f7c5efaae9262a9e77edc55ccd490b46fd"} Apr 23 01:13:52.264416 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:52.264128 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"3259f911-f547-46dd-8b53-171fdc52f63a","Type":"ContainerStarted","Data":"b194271e656626afe8bf4dfe23a969329a30d4b15a96126e6e54e6d345315f30"} Apr 23 01:13:52.289743 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:52.289697 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.2896851910000002 podStartE2EDuration="2.289685191s" podCreationTimestamp="2026-04-23 01:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:13:52.287828351 +0000 UTC m=+219.247681811" watchObservedRunningTime="2026-04-23 01:13:52.289685191 +0000 UTC m=+219.249538654" Apr 23 01:13:58.355348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:58.355315 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:58.355777 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:58.355359 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:58.359717 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:58.359698 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:13:59.288593 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:13:59.288564 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:14:41.516661 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.516587 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qvzx4"] Apr 23 01:14:41.520624 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.520607 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.523290 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.523269 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 01:14:41.525793 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.525772 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qvzx4"] Apr 23 01:14:41.625155 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.625133 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5b97054-a7d6-400f-9526-e28785d300ef-original-pull-secret\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.625259 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.625173 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5b97054-a7d6-400f-9526-e28785d300ef-dbus\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.625259 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.625198 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5b97054-a7d6-400f-9526-e28785d300ef-kubelet-config\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.725639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.725612 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/c5b97054-a7d6-400f-9526-e28785d300ef-dbus\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.725721 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.725649 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5b97054-a7d6-400f-9526-e28785d300ef-kubelet-config\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.725721 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.725682 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5b97054-a7d6-400f-9526-e28785d300ef-original-pull-secret\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.725804 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.725749 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c5b97054-a7d6-400f-9526-e28785d300ef-kubelet-config\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.725804 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.725776 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c5b97054-a7d6-400f-9526-e28785d300ef-dbus\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.727911 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.727893 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c5b97054-a7d6-400f-9526-e28785d300ef-original-pull-secret\") pod \"global-pull-secret-syncer-qvzx4\" (UID: \"c5b97054-a7d6-400f-9526-e28785d300ef\") " pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.830846 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.830790 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qvzx4" Apr 23 01:14:41.941293 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:41.941268 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qvzx4"] Apr 23 01:14:41.943766 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:14:41.943738 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5b97054_a7d6_400f_9526_e28785d300ef.slice/crio-3c1c106e24fb37c6c9ee33dcb55e6a102a257e855f5e7b4389c5fc83bafb73e6 WatchSource:0}: Error finding container 3c1c106e24fb37c6c9ee33dcb55e6a102a257e855f5e7b4389c5fc83bafb73e6: Status 404 returned error can't find the container with id 3c1c106e24fb37c6c9ee33dcb55e6a102a257e855f5e7b4389c5fc83bafb73e6 Apr 23 01:14:42.400737 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:42.400703 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qvzx4" event={"ID":"c5b97054-a7d6-400f-9526-e28785d300ef","Type":"ContainerStarted","Data":"3c1c106e24fb37c6c9ee33dcb55e6a102a257e855f5e7b4389c5fc83bafb73e6"} Apr 23 01:14:46.413550 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:46.413513 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qvzx4" event={"ID":"c5b97054-a7d6-400f-9526-e28785d300ef","Type":"ContainerStarted","Data":"602ec6f52852bc0cddba1474cee608f9a5007fead84018b824eb1fcfdfcd895c"} Apr 23 01:14:46.428745 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:46.428701 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qvzx4" podStartSLOduration=1.315130017 podStartE2EDuration="5.428687672s" podCreationTimestamp="2026-04-23 01:14:41 +0000 UTC" firstStartedPulling="2026-04-23 01:14:41.945561166 +0000 UTC m=+268.905414605" lastFinishedPulling="2026-04-23 01:14:46.05911882 +0000 UTC m=+273.018972260" observedRunningTime="2026-04-23 01:14:46.427293796 +0000 UTC m=+273.387147261" watchObservedRunningTime="2026-04-23 01:14:46.428687672 +0000 UTC m=+273.388541152" Apr 23 01:14:54.057886 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.057854 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss"] Apr 23 01:14:54.061234 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.061219 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.063875 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.063858 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 01:14:54.063965 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.063881 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mpphq\"" Apr 23 01:14:54.065278 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.065259 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 01:14:54.067290 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.067266 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss"] Apr 23 01:14:54.109510 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.109488 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.109638 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.109538 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.109638 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.109571 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxfv\" (UniqueName: \"kubernetes.io/projected/d22bcf28-a258-491c-b1b3-cf1e0c47de11-kube-api-access-kgxfv\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.210785 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.210762 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.210860 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.210807 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.210860 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.210827 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxfv\" (UniqueName: \"kubernetes.io/projected/d22bcf28-a258-491c-b1b3-cf1e0c47de11-kube-api-access-kgxfv\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.211224 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.211202 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.211267 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.211226 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" Apr 23 01:14:54.218652 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.218626 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxfv\" 
(UniqueName: \"kubernetes.io/projected/d22bcf28-a258-491c-b1b3-cf1e0c47de11-kube-api-access-kgxfv\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss"
Apr 23 01:14:54.370722 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.370672 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss"
Apr 23 01:14:54.482353 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:54.482324 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss"]
Apr 23 01:14:54.485998 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:14:54.485959 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd22bcf28_a258_491c_b1b3_cf1e0c47de11.slice/crio-2771ee36dd5d9ea69c46399ca565bce4f6494bd05009fe85865f26d9e24754cc WatchSource:0}: Error finding container 2771ee36dd5d9ea69c46399ca565bce4f6494bd05009fe85865f26d9e24754cc: Status 404 returned error can't find the container with id 2771ee36dd5d9ea69c46399ca565bce4f6494bd05009fe85865f26d9e24754cc
Apr 23 01:14:55.448022 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:14:55.447965 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" event={"ID":"d22bcf28-a258-491c-b1b3-cf1e0c47de11","Type":"ContainerStarted","Data":"2771ee36dd5d9ea69c46399ca565bce4f6494bd05009fe85865f26d9e24754cc"}
Apr 23 01:15:01.465198 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:01.465163 2565 generic.go:358] "Generic (PLEG): container finished" podID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerID="fec7a56b0304e4320293e91560325f4e71c5b44df721c1fff158050151828148" exitCode=0
Apr 23 01:15:01.465641 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:01.465212 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" event={"ID":"d22bcf28-a258-491c-b1b3-cf1e0c47de11","Type":"ContainerDied","Data":"fec7a56b0304e4320293e91560325f4e71c5b44df721c1fff158050151828148"}
Apr 23 01:15:04.474580 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:04.474546 2565 generic.go:358] "Generic (PLEG): container finished" podID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerID="b3e5e7d22651ace3fec7fb892fbadf3c0418a2dce643c4539b70036288c1b12f" exitCode=0
Apr 23 01:15:04.474885 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:04.474609 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" event={"ID":"d22bcf28-a258-491c-b1b3-cf1e0c47de11","Type":"ContainerDied","Data":"b3e5e7d22651ace3fec7fb892fbadf3c0418a2dce643c4539b70036288c1b12f"}
Apr 23 01:15:11.498143 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:11.498084 2565 generic.go:358] "Generic (PLEG): container finished" podID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerID="c5d7ab6bc28974bb68f25b632d635f3bcb88c6c8b14c3070a5ab64857e8dc372" exitCode=0
Apr 23 01:15:11.498143 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:11.498123 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" event={"ID":"d22bcf28-a258-491c-b1b3-cf1e0c47de11","Type":"ContainerDied","Data":"c5d7ab6bc28974bb68f25b632d635f3bcb88c6c8b14c3070a5ab64857e8dc372"}
Apr 23 01:15:12.619246 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.619224 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss"
Apr 23 01:15:12.763437 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.763379 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgxfv\" (UniqueName: \"kubernetes.io/projected/d22bcf28-a258-491c-b1b3-cf1e0c47de11-kube-api-access-kgxfv\") pod \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") "
Apr 23 01:15:12.763437 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.763411 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-bundle\") pod \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") "
Apr 23 01:15:12.763437 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.763431 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-util\") pod \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\" (UID: \"d22bcf28-a258-491c-b1b3-cf1e0c47de11\") "
Apr 23 01:15:12.764099 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.764076 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-bundle" (OuterVolumeSpecName: "bundle") pod "d22bcf28-a258-491c-b1b3-cf1e0c47de11" (UID: "d22bcf28-a258-491c-b1b3-cf1e0c47de11"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:15:12.765286 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.765264 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22bcf28-a258-491c-b1b3-cf1e0c47de11-kube-api-access-kgxfv" (OuterVolumeSpecName: "kube-api-access-kgxfv") pod "d22bcf28-a258-491c-b1b3-cf1e0c47de11" (UID: "d22bcf28-a258-491c-b1b3-cf1e0c47de11"). InnerVolumeSpecName "kube-api-access-kgxfv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:15:12.767339 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.767316 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-util" (OuterVolumeSpecName: "util") pod "d22bcf28-a258-491c-b1b3-cf1e0c47de11" (UID: "d22bcf28-a258-491c-b1b3-cf1e0c47de11"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:15:12.864704 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.864680 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgxfv\" (UniqueName: \"kubernetes.io/projected/d22bcf28-a258-491c-b1b3-cf1e0c47de11-kube-api-access-kgxfv\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:12.864704 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.864701 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:12.864817 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:12.864715 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d22bcf28-a258-491c-b1b3-cf1e0c47de11-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:13.473289 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:13.473263 2565 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 01:15:13.504837 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:13.504718 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss" event={"ID":"d22bcf28-a258-491c-b1b3-cf1e0c47de11","Type":"ContainerDied","Data":"2771ee36dd5d9ea69c46399ca565bce4f6494bd05009fe85865f26d9e24754cc"}
Apr 23 01:15:13.504837 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:13.504748 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2771ee36dd5d9ea69c46399ca565bce4f6494bd05009fe85865f26d9e24754cc"
Apr 23 01:15:13.504837 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:13.504779 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d87sss"
Apr 23 01:15:21.429895 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.429866 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"]
Apr 23 01:15:21.430253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.430124 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerName="pull"
Apr 23 01:15:21.430253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.430136 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerName="pull"
Apr 23 01:15:21.430253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.430150 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerName="extract"
Apr 23 01:15:21.430253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.430156 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerName="extract"
Apr 23 01:15:21.430253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.430165 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerName="util"
Apr 23 01:15:21.430253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.430170 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerName="util"
Apr 23 01:15:21.430253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.430212 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d22bcf28-a258-491c-b1b3-cf1e0c47de11" containerName="extract"
Apr 23 01:15:21.433784 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.433768 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.436417 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.436396 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mpphq\""
Apr 23 01:15:21.436521 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.436396 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 01:15:21.437926 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.437906 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 01:15:21.439708 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.439687 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"]
Apr 23 01:15:21.524109 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.524083 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.524214 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.524139 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcgql\" (UniqueName: \"kubernetes.io/projected/d33c9938-0203-44e2-bee8-07e931516403-kube-api-access-jcgql\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.524277 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.524211 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.625017 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.624994 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcgql\" (UniqueName: \"kubernetes.io/projected/d33c9938-0203-44e2-bee8-07e931516403-kube-api-access-jcgql\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.625107 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.625031 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.625107 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.625076 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.625451 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.625431 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.625491 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.625463 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.638656 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.638631 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcgql\" (UniqueName: \"kubernetes.io/projected/d33c9938-0203-44e2-bee8-07e931516403-kube-api-access-jcgql\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.743206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.743159 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:21.856578 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.856551 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"]
Apr 23 01:15:21.860039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:15:21.860006 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd33c9938_0203_44e2_bee8_07e931516403.slice/crio-7f8db5d6e252041b125b9a628931ae74c4b1387e0c4152fb0e95bee5c89d0a3e WatchSource:0}: Error finding container 7f8db5d6e252041b125b9a628931ae74c4b1387e0c4152fb0e95bee5c89d0a3e: Status 404 returned error can't find the container with id 7f8db5d6e252041b125b9a628931ae74c4b1387e0c4152fb0e95bee5c89d0a3e
Apr 23 01:15:21.861885 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:21.861859 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:15:22.529700 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:22.529664 2565 generic.go:358] "Generic (PLEG): container finished" podID="d33c9938-0203-44e2-bee8-07e931516403" containerID="28947d99c6870781fd5e99568067abab0564f646b610ecaace3b9aaa8a994d82" exitCode=0
Apr 23 01:15:22.530121 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:22.529704 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92" event={"ID":"d33c9938-0203-44e2-bee8-07e931516403","Type":"ContainerDied","Data":"28947d99c6870781fd5e99568067abab0564f646b610ecaace3b9aaa8a994d82"}
Apr 23 01:15:22.530121 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:22.529725 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92" event={"ID":"d33c9938-0203-44e2-bee8-07e931516403","Type":"ContainerStarted","Data":"7f8db5d6e252041b125b9a628931ae74c4b1387e0c4152fb0e95bee5c89d0a3e"}
Apr 23 01:15:23.375877 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.375848 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-cvdcl"]
Apr 23 01:15:23.379235 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.379213 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.382036 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.382016 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 23 01:15:23.382113 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.382016 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-5zxms\""
Apr 23 01:15:23.383285 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.383266 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 23 01:15:23.390040 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.390005 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-cvdcl"]
Apr 23 01:15:23.540187 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.540158 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e44f8cc-c52f-4311-863a-73b77083a030-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-cvdcl\" (UID: \"4e44f8cc-c52f-4311-863a-73b77083a030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.540490 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.540211 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn2xz\" (UniqueName: \"kubernetes.io/projected/4e44f8cc-c52f-4311-863a-73b77083a030-kube-api-access-bn2xz\") pod \"cert-manager-webhook-587ccfb98-cvdcl\" (UID: \"4e44f8cc-c52f-4311-863a-73b77083a030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.641245 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.641221 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn2xz\" (UniqueName: \"kubernetes.io/projected/4e44f8cc-c52f-4311-863a-73b77083a030-kube-api-access-bn2xz\") pod \"cert-manager-webhook-587ccfb98-cvdcl\" (UID: \"4e44f8cc-c52f-4311-863a-73b77083a030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.641478 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.641454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e44f8cc-c52f-4311-863a-73b77083a030-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-cvdcl\" (UID: \"4e44f8cc-c52f-4311-863a-73b77083a030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.648808 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.648782 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e44f8cc-c52f-4311-863a-73b77083a030-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-cvdcl\" (UID: \"4e44f8cc-c52f-4311-863a-73b77083a030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.648889 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.648783 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn2xz\" (UniqueName: \"kubernetes.io/projected/4e44f8cc-c52f-4311-863a-73b77083a030-kube-api-access-bn2xz\") pod \"cert-manager-webhook-587ccfb98-cvdcl\" (UID: \"4e44f8cc-c52f-4311-863a-73b77083a030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.690750 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.690729 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:23.807080 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:23.807047 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-cvdcl"]
Apr 23 01:15:23.810212 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:15:23.810180 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e44f8cc_c52f_4311_863a_73b77083a030.slice/crio-6b65299db52443d41fb78bfa3f855e79fb373b4ffb95239819b4bf3a4b79c7f8 WatchSource:0}: Error finding container 6b65299db52443d41fb78bfa3f855e79fb373b4ffb95239819b4bf3a4b79c7f8: Status 404 returned error can't find the container with id 6b65299db52443d41fb78bfa3f855e79fb373b4ffb95239819b4bf3a4b79c7f8
Apr 23 01:15:24.536695 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:24.536662 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl" event={"ID":"4e44f8cc-c52f-4311-863a-73b77083a030","Type":"ContainerStarted","Data":"6b65299db52443d41fb78bfa3f855e79fb373b4ffb95239819b4bf3a4b79c7f8"}
Apr 23 01:15:25.541131 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:25.541094 2565 generic.go:358] "Generic (PLEG): container finished" podID="d33c9938-0203-44e2-bee8-07e931516403" containerID="a8b5f07484b4f49fd7ba173348de9ca709576ddd2013803dccfe1032eb3eed92" exitCode=0
Apr 23 01:15:25.541505 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:25.541176 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92" event={"ID":"d33c9938-0203-44e2-bee8-07e931516403","Type":"ContainerDied","Data":"a8b5f07484b4f49fd7ba173348de9ca709576ddd2013803dccfe1032eb3eed92"}
Apr 23 01:15:26.546432 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:26.546393 2565 generic.go:358] "Generic (PLEG): container finished" podID="d33c9938-0203-44e2-bee8-07e931516403" containerID="2d5adaabebe05890a4a0893a5159f7c30f8eb4fbbcfdfea76e4238a3212f1f13" exitCode=0
Apr 23 01:15:26.546839 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:26.546479 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92" event={"ID":"d33c9938-0203-44e2-bee8-07e931516403","Type":"ContainerDied","Data":"2d5adaabebe05890a4a0893a5159f7c30f8eb4fbbcfdfea76e4238a3212f1f13"}
Apr 23 01:15:27.550966 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.550935 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl" event={"ID":"4e44f8cc-c52f-4311-863a-73b77083a030","Type":"ContainerStarted","Data":"995ea246c743bdc5f23c719ad6032fe475c5830cfc09abf1d95d5b60eb0cac42"}
Apr 23 01:15:27.551404 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.551043 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:27.566604 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.566555 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl" podStartSLOduration=1.313116476 podStartE2EDuration="4.56654209s" podCreationTimestamp="2026-04-23 01:15:23 +0000 UTC" firstStartedPulling="2026-04-23 01:15:23.812645852 +0000 UTC m=+310.772499295" lastFinishedPulling="2026-04-23 01:15:27.066071471 +0000 UTC m=+314.025924909" observedRunningTime="2026-04-23 01:15:27.565734347 +0000 UTC m=+314.525587806" watchObservedRunningTime="2026-04-23 01:15:27.56654209 +0000 UTC m=+314.526395548"
Apr 23 01:15:27.667795 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.667773 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:27.774906 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.774882 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-util\") pod \"d33c9938-0203-44e2-bee8-07e931516403\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") "
Apr 23 01:15:27.775025 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.774939 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcgql\" (UniqueName: \"kubernetes.io/projected/d33c9938-0203-44e2-bee8-07e931516403-kube-api-access-jcgql\") pod \"d33c9938-0203-44e2-bee8-07e931516403\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") "
Apr 23 01:15:27.775025 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.775014 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-bundle\") pod \"d33c9938-0203-44e2-bee8-07e931516403\" (UID: \"d33c9938-0203-44e2-bee8-07e931516403\") "
Apr 23 01:15:27.775362 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.775337 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-bundle" (OuterVolumeSpecName: "bundle") pod "d33c9938-0203-44e2-bee8-07e931516403" (UID: "d33c9938-0203-44e2-bee8-07e931516403"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:15:27.776779 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.776761 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33c9938-0203-44e2-bee8-07e931516403-kube-api-access-jcgql" (OuterVolumeSpecName: "kube-api-access-jcgql") pod "d33c9938-0203-44e2-bee8-07e931516403" (UID: "d33c9938-0203-44e2-bee8-07e931516403"). InnerVolumeSpecName "kube-api-access-jcgql". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:15:27.780750 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.780728 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-util" (OuterVolumeSpecName: "util") pod "d33c9938-0203-44e2-bee8-07e931516403" (UID: "d33c9938-0203-44e2-bee8-07e931516403"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:15:27.875534 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.875482 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:27.875534 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.875509 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d33c9938-0203-44e2-bee8-07e931516403-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:27.875534 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:27.875518 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jcgql\" (UniqueName: \"kubernetes.io/projected/d33c9938-0203-44e2-bee8-07e931516403-kube-api-access-jcgql\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:28.554836 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:28.554801 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92" event={"ID":"d33c9938-0203-44e2-bee8-07e931516403","Type":"ContainerDied","Data":"7f8db5d6e252041b125b9a628931ae74c4b1387e0c4152fb0e95bee5c89d0a3e"}
Apr 23 01:15:28.554836 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:28.554835 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgmm92"
Apr 23 01:15:28.555250 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:28.554836 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f8db5d6e252041b125b9a628931ae74c4b1387e0c4152fb0e95bee5c89d0a3e"
Apr 23 01:15:33.559392 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:33.559367 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-cvdcl"
Apr 23 01:15:42.013029 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.012998 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"]
Apr 23 01:15:42.013348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.013239 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d33c9938-0203-44e2-bee8-07e931516403" containerName="extract"
Apr 23 01:15:42.013348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.013249 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33c9938-0203-44e2-bee8-07e931516403" containerName="extract"
Apr 23 01:15:42.013348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.013265 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d33c9938-0203-44e2-bee8-07e931516403" containerName="util"
Apr 23 01:15:42.013348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.013271 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33c9938-0203-44e2-bee8-07e931516403" containerName="util"
Apr 23 01:15:42.013348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.013284 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d33c9938-0203-44e2-bee8-07e931516403" containerName="pull"
Apr 23 01:15:42.013348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.013288 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33c9938-0203-44e2-bee8-07e931516403" containerName="pull"
Apr 23 01:15:42.013348 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.013327 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d33c9938-0203-44e2-bee8-07e931516403" containerName="extract"
Apr 23 01:15:42.034252 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.034226 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"]
Apr 23 01:15:42.034398 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.034375 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.037446 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.037424 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mpphq\""
Apr 23 01:15:42.037562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.037485 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 01:15:42.038442 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.038428 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 01:15:42.065704 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.065680 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.065806 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.065720 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.065806 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.065744 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhgmd\" (UniqueName: \"kubernetes.io/projected/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-kube-api-access-rhgmd\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.166670 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.166646 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.166746 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.166688 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.166746 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.166706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhgmd\" (UniqueName: \"kubernetes.io/projected/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-kube-api-access-rhgmd\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.167033 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.167017 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.167101 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.167083 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.174514 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.174488 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhgmd\" (UniqueName: \"kubernetes.io/projected/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-kube-api-access-rhgmd\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.344019 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.343955 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"
Apr 23 01:15:42.461295 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.461146 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb"]
Apr 23 01:15:42.463614 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:15:42.463584 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703f7bb5_e9df_4ef2_a968_b685eb97d1e0.slice/crio-8ebb577668859347c36ab1dc2696e80d451bb39165fcc1f53b10ad60de10d8ff WatchSource:0}: Error finding container 8ebb577668859347c36ab1dc2696e80d451bb39165fcc1f53b10ad60de10d8ff: Status 404 returned error can't find the container with id 8ebb577668859347c36ab1dc2696e80d451bb39165fcc1f53b10ad60de10d8ff
Apr 23 01:15:42.593253 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.593222 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" event={"ID":"703f7bb5-e9df-4ef2-a968-b685eb97d1e0","Type":"ContainerStarted","Data":"872d78fa8842f996ae0b8a274324c74ddd9e25f64a5b444aea714b4946c170f7"}
Apr 23 01:15:42.593359 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:42.593260 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" event={"ID":"703f7bb5-e9df-4ef2-a968-b685eb97d1e0","Type":"ContainerStarted","Data":"8ebb577668859347c36ab1dc2696e80d451bb39165fcc1f53b10ad60de10d8ff"}
Apr 23 01:15:43.596305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:43.596234 2565 generic.go:358] "Generic (PLEG): container finished" podID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerID="872d78fa8842f996ae0b8a274324c74ddd9e25f64a5b444aea714b4946c170f7" exitCode=0
Apr 23 01:15:43.596713 ip-10-0-135-74 kubenswrapper[2565]:
I0423 01:15:43.596304 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" event={"ID":"703f7bb5-e9df-4ef2-a968-b685eb97d1e0","Type":"ContainerDied","Data":"872d78fa8842f996ae0b8a274324c74ddd9e25f64a5b444aea714b4946c170f7"} Apr 23 01:15:44.601272 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:44.601191 2565 generic.go:358] "Generic (PLEG): container finished" podID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerID="ff91d55eb94f658f85b16d647199f166608c27a15ea03dd203a38d31e77a8500" exitCode=0 Apr 23 01:15:44.601598 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:44.601277 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" event={"ID":"703f7bb5-e9df-4ef2-a968-b685eb97d1e0","Type":"ContainerDied","Data":"ff91d55eb94f658f85b16d647199f166608c27a15ea03dd203a38d31e77a8500"} Apr 23 01:15:45.605703 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:45.605666 2565 generic.go:358] "Generic (PLEG): container finished" podID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerID="06d969083e66eb596e5d3067502b3c7766aa32818383003ff0d5b84daa0a401c" exitCode=0 Apr 23 01:15:45.606078 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:45.605718 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" event={"ID":"703f7bb5-e9df-4ef2-a968-b685eb97d1e0","Type":"ContainerDied","Data":"06d969083e66eb596e5d3067502b3c7766aa32818383003ff0d5b84daa0a401c"} Apr 23 01:15:46.724855 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.724833 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" Apr 23 01:15:46.799451 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.799423 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-util\") pod \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " Apr 23 01:15:46.799561 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.799455 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-bundle\") pod \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " Apr 23 01:15:46.799561 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.799550 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhgmd\" (UniqueName: \"kubernetes.io/projected/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-kube-api-access-rhgmd\") pod \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\" (UID: \"703f7bb5-e9df-4ef2-a968-b685eb97d1e0\") " Apr 23 01:15:46.800218 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.800192 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-bundle" (OuterVolumeSpecName: "bundle") pod "703f7bb5-e9df-4ef2-a968-b685eb97d1e0" (UID: "703f7bb5-e9df-4ef2-a968-b685eb97d1e0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:15:46.801577 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.801550 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-kube-api-access-rhgmd" (OuterVolumeSpecName: "kube-api-access-rhgmd") pod "703f7bb5-e9df-4ef2-a968-b685eb97d1e0" (UID: "703f7bb5-e9df-4ef2-a968-b685eb97d1e0"). InnerVolumeSpecName "kube-api-access-rhgmd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:15:46.805101 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.805051 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-util" (OuterVolumeSpecName: "util") pod "703f7bb5-e9df-4ef2-a968-b685eb97d1e0" (UID: "703f7bb5-e9df-4ef2-a968-b685eb97d1e0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:15:46.900587 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.900564 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rhgmd\" (UniqueName: \"kubernetes.io/projected/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-kube-api-access-rhgmd\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:15:46.900587 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.900585 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:15:46.900710 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:46.900595 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703f7bb5-e9df-4ef2-a968-b685eb97d1e0-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:15:47.614713 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:47.614680 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" event={"ID":"703f7bb5-e9df-4ef2-a968-b685eb97d1e0","Type":"ContainerDied","Data":"8ebb577668859347c36ab1dc2696e80d451bb39165fcc1f53b10ad60de10d8ff"} Apr 23 01:15:47.614713 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:47.614699 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54f2bb" Apr 23 01:15:47.614713 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:47.614714 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ebb577668859347c36ab1dc2696e80d451bb39165fcc1f53b10ad60de10d8ff" Apr 23 01:15:51.835938 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.835897 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v"] Apr 23 01:15:51.836490 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.836470 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerName="pull" Apr 23 01:15:51.836562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.836494 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerName="pull" Apr 23 01:15:51.836562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.836516 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerName="util" Apr 23 01:15:51.836562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.836524 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerName="util" Apr 23 01:15:51.836562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.836547 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerName="extract" Apr 23 01:15:51.836562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.836556 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerName="extract" Apr 23 01:15:51.836789 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.836656 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="703f7bb5-e9df-4ef2-a968-b685eb97d1e0" containerName="extract" Apr 23 01:15:51.841790 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.841767 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:51.844066 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.844044 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v"] Apr 23 01:15:51.844744 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.844694 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 01:15:51.844744 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.844694 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 01:15:51.844941 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.844768 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mpphq\"" Apr 23 01:15:51.938672 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.938648 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkdzx\" (UniqueName: \"kubernetes.io/projected/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-kube-api-access-zkdzx\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: 
\"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:51.938770 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.938688 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:51.938770 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:51.938724 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.039947 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.039920 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.040049 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.039962 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.040049 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.040010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkdzx\" (UniqueName: \"kubernetes.io/projected/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-kube-api-access-zkdzx\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.040293 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.040277 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.040330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.040302 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.048108 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.048091 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkdzx\" (UniqueName: \"kubernetes.io/projected/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-kube-api-access-zkdzx\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.151675 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.151655 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:52.269553 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.269515 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v"] Apr 23 01:15:52.273427 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:15:52.273401 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6467e5b_38c7_4fb1_bc5a_6cbca50379d3.slice/crio-000bfd8c32e1bb89307d9aeaa500dc68dd2495c287c5eb8531dd5dfc0db44b51 WatchSource:0}: Error finding container 000bfd8c32e1bb89307d9aeaa500dc68dd2495c287c5eb8531dd5dfc0db44b51: Status 404 returned error can't find the container with id 000bfd8c32e1bb89307d9aeaa500dc68dd2495c287c5eb8531dd5dfc0db44b51 Apr 23 01:15:52.631402 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.631332 2565 generic.go:358] "Generic (PLEG): container finished" podID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerID="b420c99412aea0c2b5d70122bf7965310dde5644fcd1bad098a59a66a752ad55" exitCode=0 Apr 23 01:15:52.631502 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.631413 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" event={"ID":"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3","Type":"ContainerDied","Data":"b420c99412aea0c2b5d70122bf7965310dde5644fcd1bad098a59a66a752ad55"} Apr 23 01:15:52.631502 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:52.631444 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" event={"ID":"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3","Type":"ContainerStarted","Data":"000bfd8c32e1bb89307d9aeaa500dc68dd2495c287c5eb8531dd5dfc0db44b51"} Apr 23 01:15:53.634895 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.634822 2565 generic.go:358] "Generic (PLEG): container finished" podID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerID="0f85e4e8cc5eb2b08af1bbb056b501dce753a7d4c2d48b16d8a81643256e4b81" exitCode=0 Apr 23 01:15:53.634895 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.634887 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" event={"ID":"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3","Type":"ContainerDied","Data":"0f85e4e8cc5eb2b08af1bbb056b501dce753a7d4c2d48b16d8a81643256e4b81"} Apr 23 01:15:53.658462 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.658441 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88"] Apr 23 01:15:53.662515 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.662501 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.665640 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.665621 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-hmfjw\"" Apr 23 01:15:53.665970 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.665955 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 23 01:15:53.666258 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.666243 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 23 01:15:53.666919 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.666901 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 23 01:15:53.667025 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.666908 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 23 01:15:53.679676 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.679654 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88"] Apr 23 01:15:53.750999 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.750951 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.751123 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.751013 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8j2j\" (UniqueName: \"kubernetes.io/projected/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-kube-api-access-w8j2j\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.751123 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.751086 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.851562 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.851530 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.851668 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.851597 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.851668 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.851625 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w8j2j\" (UniqueName: \"kubernetes.io/projected/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-kube-api-access-w8j2j\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.853781 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.853752 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.853781 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.853767 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.864908 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.864886 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8j2j\" (UniqueName: \"kubernetes.io/projected/d9c99aae-b547-4c5d-b3cc-648cbdd109b2-kube-api-access-w8j2j\") pod \"opendatahub-operator-controller-manager-5fb5768b86-cxw88\" (UID: \"d9c99aae-b547-4c5d-b3cc-648cbdd109b2\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:53.992280 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:53.992259 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" Apr 23 01:15:54.110654 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:54.110630 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88"] Apr 23 01:15:54.113421 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:15:54.113393 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c99aae_b547_4c5d_b3cc_648cbdd109b2.slice/crio-22cc61fc137ae0809d19a2ff3aeab10bc018fccb4c101ba44566672f76934055 WatchSource:0}: Error finding container 22cc61fc137ae0809d19a2ff3aeab10bc018fccb4c101ba44566672f76934055: Status 404 returned error can't find the container with id 22cc61fc137ae0809d19a2ff3aeab10bc018fccb4c101ba44566672f76934055 Apr 23 01:15:54.639991 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:54.639943 2565 generic.go:358] "Generic (PLEG): container finished" podID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerID="223de65667a4584640dad22bd831fe55e8a75cfef55124f2a570397a2f5043d9" exitCode=0 Apr 23 01:15:54.640384 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:54.640015 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" event={"ID":"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3","Type":"ContainerDied","Data":"223de65667a4584640dad22bd831fe55e8a75cfef55124f2a570397a2f5043d9"} Apr 23 01:15:54.641111 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:54.641091 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" event={"ID":"d9c99aae-b547-4c5d-b3cc-648cbdd109b2","Type":"ContainerStarted","Data":"22cc61fc137ae0809d19a2ff3aeab10bc018fccb4c101ba44566672f76934055"} Apr 23 01:15:56.692481 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.692458 2565 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" Apr 23 01:15:56.775997 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.775877 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-util\") pod \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " Apr 23 01:15:56.775997 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.775917 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkdzx\" (UniqueName: \"kubernetes.io/projected/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-kube-api-access-zkdzx\") pod \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " Apr 23 01:15:56.776120 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.776013 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-bundle\") pod \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\" (UID: \"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3\") " Apr 23 01:15:56.777297 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.777265 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-bundle" (OuterVolumeSpecName: "bundle") pod "e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" (UID: "e6467e5b-38c7-4fb1-bc5a-6cbca50379d3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:15:56.777798 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.777773 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-kube-api-access-zkdzx" (OuterVolumeSpecName: "kube-api-access-zkdzx") pod "e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" (UID: "e6467e5b-38c7-4fb1-bc5a-6cbca50379d3"). InnerVolumeSpecName "kube-api-access-zkdzx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:15:56.781041 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.781017 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-util" (OuterVolumeSpecName: "util") pod "e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" (UID: "e6467e5b-38c7-4fb1-bc5a-6cbca50379d3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:15:56.877305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.877261 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:56.877305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.877302 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:56.877305 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:56.877313 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkdzx\" (UniqueName: \"kubernetes.io/projected/e6467e5b-38c7-4fb1-bc5a-6cbca50379d3-kube-api-access-zkdzx\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:15:57.653583 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:57.653546 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v" event={"ID":"e6467e5b-38c7-4fb1-bc5a-6cbca50379d3","Type":"ContainerDied","Data":"000bfd8c32e1bb89307d9aeaa500dc68dd2495c287c5eb8531dd5dfc0db44b51"}
Apr 23 01:15:57.653583 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:57.653580 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="000bfd8c32e1bb89307d9aeaa500dc68dd2495c287c5eb8531dd5dfc0db44b51"
Apr 23 01:15:57.653776 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:57.653581 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c958c6v"
Apr 23 01:15:57.654961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:57.654934 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" event={"ID":"d9c99aae-b547-4c5d-b3cc-648cbdd109b2","Type":"ContainerStarted","Data":"a7e5b4effad96aae159af9c131bc93bfba14da17a3445985afb87e1b0cc9bd19"}
Apr 23 01:15:57.655104 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:57.655092 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88"
Apr 23 01:15:57.764313 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:15:57.764265 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88" podStartSLOduration=2.150050025 podStartE2EDuration="4.764249431s" podCreationTimestamp="2026-04-23 01:15:53 +0000 UTC" firstStartedPulling="2026-04-23 01:15:54.115695505 +0000 UTC m=+341.075548948" lastFinishedPulling="2026-04-23 01:15:56.729894913 +0000 UTC m=+343.689748354" observedRunningTime="2026-04-23 01:15:57.680859452 +0000 UTC m=+344.640712913" watchObservedRunningTime="2026-04-23 01:15:57.764249431 +0000 UTC m=+344.724102894"
Apr 23 01:16:08.660554 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:08.660478 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-cxw88"
Apr 23 01:16:10.757961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.757928 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"]
Apr 23 01:16:10.758335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.758258 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerName="util"
Apr 23 01:16:10.758335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.758270 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerName="util"
Apr 23 01:16:10.758335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.758279 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerName="extract"
Apr 23 01:16:10.758335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.758285 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerName="extract"
Apr 23 01:16:10.758335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.758293 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerName="pull"
Apr 23 01:16:10.758335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.758298 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerName="pull"
Apr 23 01:16:10.758539 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.758343 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6467e5b-38c7-4fb1-bc5a-6cbca50379d3" containerName="extract"
Apr 23 01:16:10.761453 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.761437 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.763971 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.763947 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 01:16:10.764137 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.764121 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mpphq\""
Apr 23 01:16:10.765195 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.765181 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 01:16:10.773768 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.773752 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"]
Apr 23 01:16:10.882847 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.882816 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.883025 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.882872 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm7tl\" (UniqueName: \"kubernetes.io/projected/f4724335-c07f-4a0c-800e-9a327475922f-kube-api-access-sm7tl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.883025 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.882911 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.983498 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.983463 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.983670 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.983527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sm7tl\" (UniqueName: \"kubernetes.io/projected/f4724335-c07f-4a0c-800e-9a327475922f-kube-api-access-sm7tl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.983670 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.983554 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.983842 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.983822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.983901 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.983856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:10.992081 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:10.992051 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm7tl\" (UniqueName: \"kubernetes.io/projected/f4724335-c07f-4a0c-800e-9a327475922f-kube-api-access-sm7tl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:11.070669 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:11.070600 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:11.187581 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:11.187560 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"]
Apr 23 01:16:11.189935 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:16:11.189905 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4724335_c07f_4a0c_800e_9a327475922f.slice/crio-6718cd443143b1bb2e4e3e8a6e6284d8ec3e96ff7a909e48bfd3b152dd5c7ae4 WatchSource:0}: Error finding container 6718cd443143b1bb2e4e3e8a6e6284d8ec3e96ff7a909e48bfd3b152dd5c7ae4: Status 404 returned error can't find the container with id 6718cd443143b1bb2e4e3e8a6e6284d8ec3e96ff7a909e48bfd3b152dd5c7ae4
Apr 23 01:16:11.698513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:11.698484 2565 generic.go:358] "Generic (PLEG): container finished" podID="f4724335-c07f-4a0c-800e-9a327475922f" containerID="de437aeeb3a34fca20d9418e17df95928fa614eb3ffbdcbab15fb617f1e36e1a" exitCode=0
Apr 23 01:16:11.698644 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:11.698569 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv" event={"ID":"f4724335-c07f-4a0c-800e-9a327475922f","Type":"ContainerDied","Data":"de437aeeb3a34fca20d9418e17df95928fa614eb3ffbdcbab15fb617f1e36e1a"}
Apr 23 01:16:11.698644 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:11.698601 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv" event={"ID":"f4724335-c07f-4a0c-800e-9a327475922f","Type":"ContainerStarted","Data":"6718cd443143b1bb2e4e3e8a6e6284d8ec3e96ff7a909e48bfd3b152dd5c7ae4"}
Apr 23 01:16:12.001867 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.001836 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-8596599875-gnq79"]
Apr 23 01:16:12.004701 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.004685 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.007243 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.007222 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 23 01:16:12.007243 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.007239 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 23 01:16:12.007398 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.007364 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-nmrtz\""
Apr 23 01:16:12.013024 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.012995 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-8596599875-gnq79"]
Apr 23 01:16:12.091243 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.091222 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/693a7cef-37fb-4989-a8cc-6ae494d989a1-tmp\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.091357 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.091279 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/693a7cef-37fb-4989-a8cc-6ae494d989a1-tls-certs\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.091402 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.091366 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gn67\" (UniqueName: \"kubernetes.io/projected/693a7cef-37fb-4989-a8cc-6ae494d989a1-kube-api-access-2gn67\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.191859 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.191834 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/693a7cef-37fb-4989-a8cc-6ae494d989a1-tmp\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.191943 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.191897 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/693a7cef-37fb-4989-a8cc-6ae494d989a1-tls-certs\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.192010 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.191939 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gn67\" (UniqueName: \"kubernetes.io/projected/693a7cef-37fb-4989-a8cc-6ae494d989a1-kube-api-access-2gn67\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.194119 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.194097 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/693a7cef-37fb-4989-a8cc-6ae494d989a1-tmp\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.194317 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.194301 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/693a7cef-37fb-4989-a8cc-6ae494d989a1-tls-certs\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.199067 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.199049 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gn67\" (UniqueName: \"kubernetes.io/projected/693a7cef-37fb-4989-a8cc-6ae494d989a1-kube-api-access-2gn67\") pod \"kube-auth-proxy-8596599875-gnq79\" (UID: \"693a7cef-37fb-4989-a8cc-6ae494d989a1\") " pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.314758 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.314713 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79"
Apr 23 01:16:12.451507 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.451481 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-8596599875-gnq79"]
Apr 23 01:16:12.455295 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:16:12.455267 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod693a7cef_37fb_4989_a8cc_6ae494d989a1.slice/crio-3632754464bc8e12fe0191a663f79830b4bdafa1bf1b207a10ceafe984a95281 WatchSource:0}: Error finding container 3632754464bc8e12fe0191a663f79830b4bdafa1bf1b207a10ceafe984a95281: Status 404 returned error can't find the container with id 3632754464bc8e12fe0191a663f79830b4bdafa1bf1b207a10ceafe984a95281
Apr 23 01:16:12.702489 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.702462 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79" event={"ID":"693a7cef-37fb-4989-a8cc-6ae494d989a1","Type":"ContainerStarted","Data":"3632754464bc8e12fe0191a663f79830b4bdafa1bf1b207a10ceafe984a95281"}
Apr 23 01:16:12.704111 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.704087 2565 generic.go:358] "Generic (PLEG): container finished" podID="f4724335-c07f-4a0c-800e-9a327475922f" containerID="521b237a50f23d51a673e7d09875fc646846192d16262c40db56dd500d482490" exitCode=0
Apr 23 01:16:12.704217 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:12.704168 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv" event={"ID":"f4724335-c07f-4a0c-800e-9a327475922f","Type":"ContainerDied","Data":"521b237a50f23d51a673e7d09875fc646846192d16262c40db56dd500d482490"}
Apr 23 01:16:13.230395 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.230356 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"]
Apr 23 01:16:13.234044 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.234018 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.237127 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.236905 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:16:13.238890 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.238711 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 23 01:16:13.238890 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.238737 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 23 01:16:13.238890 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.238776 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 23 01:16:13.239293 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.239035 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-slpqr\""
Apr 23 01:16:13.239293 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.239052 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 23 01:16:13.242395 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.242371 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"]
Apr 23 01:16:13.301435 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.301186 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hsq\" (UniqueName: \"kubernetes.io/projected/a5b48147-74ff-45de-b9df-251ff995fea3-kube-api-access-t7hsq\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.301435 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.301262 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a5b48147-74ff-45de-b9df-251ff995fea3-manager-config\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.301435 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.301303 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5b48147-74ff-45de-b9df-251ff995fea3-cert\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.301435 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.301369 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5b48147-74ff-45de-b9df-251ff995fea3-metrics-cert\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.402670 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.402633 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5b48147-74ff-45de-b9df-251ff995fea3-metrics-cert\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.402853 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.402679 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hsq\" (UniqueName: \"kubernetes.io/projected/a5b48147-74ff-45de-b9df-251ff995fea3-kube-api-access-t7hsq\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.402853 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.402735 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a5b48147-74ff-45de-b9df-251ff995fea3-manager-config\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.402853 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.402772 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5b48147-74ff-45de-b9df-251ff995fea3-cert\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.403588 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.403551 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a5b48147-74ff-45de-b9df-251ff995fea3-manager-config\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.405702 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.405675 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5b48147-74ff-45de-b9df-251ff995fea3-cert\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.405803 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.405773 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5b48147-74ff-45de-b9df-251ff995fea3-metrics-cert\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.420581 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.420533 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hsq\" (UniqueName: \"kubernetes.io/projected/a5b48147-74ff-45de-b9df-251ff995fea3-kube-api-access-t7hsq\") pod \"lws-controller-manager-6b799cbd77-jtwlz\" (UID: \"a5b48147-74ff-45de-b9df-251ff995fea3\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.549080 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.549003 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-slpqr\""
Apr 23 01:16:13.556889 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.556863 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:13.702667 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.702539 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"]
Apr 23 01:16:13.710820 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.710782 2565 generic.go:358] "Generic (PLEG): container finished" podID="f4724335-c07f-4a0c-800e-9a327475922f" containerID="7c5d893cd9b0877b9f4e56835ddd0f72be921472350a0a1918473e62ea117af6" exitCode=0
Apr 23 01:16:13.710950 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:13.710818 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv" event={"ID":"f4724335-c07f-4a0c-800e-9a327475922f","Type":"ContainerDied","Data":"7c5d893cd9b0877b9f4e56835ddd0f72be921472350a0a1918473e62ea117af6"}
Apr 23 01:16:14.399701 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:16:14.399665 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5b48147_74ff_45de_b9df_251ff995fea3.slice/crio-e418ede24dc710a5e529578850e89f9711722244a3944fc931c20a8a67956cd4 WatchSource:0}: Error finding container e418ede24dc710a5e529578850e89f9711722244a3944fc931c20a8a67956cd4: Status 404 returned error can't find the container with id e418ede24dc710a5e529578850e89f9711722244a3944fc931c20a8a67956cd4
Apr 23 01:16:14.714718 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.714637 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz" event={"ID":"a5b48147-74ff-45de-b9df-251ff995fea3","Type":"ContainerStarted","Data":"e418ede24dc710a5e529578850e89f9711722244a3944fc931c20a8a67956cd4"}
Apr 23 01:16:14.849334 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.849312 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:14.917512 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.917478 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-bundle\") pod \"f4724335-c07f-4a0c-800e-9a327475922f\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") "
Apr 23 01:16:14.917659 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.917536 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm7tl\" (UniqueName: \"kubernetes.io/projected/f4724335-c07f-4a0c-800e-9a327475922f-kube-api-access-sm7tl\") pod \"f4724335-c07f-4a0c-800e-9a327475922f\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") "
Apr 23 01:16:14.917659 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.917637 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-util\") pod \"f4724335-c07f-4a0c-800e-9a327475922f\" (UID: \"f4724335-c07f-4a0c-800e-9a327475922f\") "
Apr 23 01:16:14.918877 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.918837 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-bundle" (OuterVolumeSpecName: "bundle") pod "f4724335-c07f-4a0c-800e-9a327475922f" (UID: "f4724335-c07f-4a0c-800e-9a327475922f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:16:14.920047 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.919944 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4724335-c07f-4a0c-800e-9a327475922f-kube-api-access-sm7tl" (OuterVolumeSpecName: "kube-api-access-sm7tl") pod "f4724335-c07f-4a0c-800e-9a327475922f" (UID: "f4724335-c07f-4a0c-800e-9a327475922f"). InnerVolumeSpecName "kube-api-access-sm7tl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:16:14.926208 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:14.926167 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-util" (OuterVolumeSpecName: "util") pod "f4724335-c07f-4a0c-800e-9a327475922f" (UID: "f4724335-c07f-4a0c-800e-9a327475922f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:16:15.018678 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:15.018606 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:16:15.018678 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:15.018636 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f4724335-c07f-4a0c-800e-9a327475922f-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:16:15.018678 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:15.018651 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sm7tl\" (UniqueName: \"kubernetes.io/projected/f4724335-c07f-4a0c-800e-9a327475922f-kube-api-access-sm7tl\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\""
Apr 23 01:16:15.720275 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:15.720235 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv" event={"ID":"f4724335-c07f-4a0c-800e-9a327475922f","Type":"ContainerDied","Data":"6718cd443143b1bb2e4e3e8a6e6284d8ec3e96ff7a909e48bfd3b152dd5c7ae4"}
Apr 23 01:16:15.720275 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:15.720255 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835kpncv"
Apr 23 01:16:15.720275 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:15.720274 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6718cd443143b1bb2e4e3e8a6e6284d8ec3e96ff7a909e48bfd3b152dd5c7ae4"
Apr 23 01:16:16.370066 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:16.370043 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 23 01:16:16.724631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:16.724595 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz" event={"ID":"a5b48147-74ff-45de-b9df-251ff995fea3","Type":"ContainerStarted","Data":"1c9f72a5432f0672a456e062cfa0133e98f1a3af1ec4121e492778f055c5592a"}
Apr 23 01:16:16.725059 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:16.724654 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz"
Apr 23 01:16:16.725868 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:16.725843 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79" event={"ID":"693a7cef-37fb-4989-a8cc-6ae494d989a1","Type":"ContainerStarted","Data":"5c0d633ed57e2007b27a360a04eaf2093f247ff7b93b2443ed59ea52410f1af1"}
Apr 23 01:16:16.740875 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:16.740838 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz" podStartSLOduration=1.7897977 podStartE2EDuration="3.74082612s" podCreationTimestamp="2026-04-23 01:16:13 +0000 UTC" firstStartedPulling="2026-04-23 01:16:14.40199319 +0000 UTC m=+361.361846641" lastFinishedPulling="2026-04-23 01:16:16.353021607 +0000 UTC m=+363.312875061" observedRunningTime="2026-04-23 01:16:16.738889758 +0000 UTC m=+363.698743218" watchObservedRunningTime="2026-04-23 01:16:16.74082612 +0000 UTC m=+363.700679616"
Apr 23 01:16:16.754679 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:16.754642 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-8596599875-gnq79" podStartSLOduration=1.844180506 podStartE2EDuration="5.754629132s" podCreationTimestamp="2026-04-23 01:16:11 +0000 UTC" firstStartedPulling="2026-04-23 01:16:12.457023922 +0000 UTC m=+359.416877360" lastFinishedPulling="2026-04-23 01:16:16.367472535 +0000 UTC m=+363.327325986" observedRunningTime="2026-04-23 01:16:16.753091585 +0000 UTC m=+363.712945044" watchObservedRunningTime="2026-04-23 01:16:16.754629132 +0000 UTC m=+363.714482607"
Apr 23 01:16:24.547402 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547367 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f"]
Apr 23 01:16:24.547772 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547646 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4724335-c07f-4a0c-800e-9a327475922f" containerName="util"
Apr 23 01:16:24.547772 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547656 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4724335-c07f-4a0c-800e-9a327475922f" containerName="util"
Apr 23 01:16:24.547772 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547671 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4724335-c07f-4a0c-800e-9a327475922f" containerName="pull"
Apr 23 01:16:24.547772 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547678 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4724335-c07f-4a0c-800e-9a327475922f" containerName="pull"
Apr 23 01:16:24.547772 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547685 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4724335-c07f-4a0c-800e-9a327475922f" containerName="extract"
Apr 23 01:16:24.547772 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547690 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4724335-c07f-4a0c-800e-9a327475922f" containerName="extract"
Apr 23 01:16:24.547772 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.547742 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4724335-c07f-4a0c-800e-9a327475922f" containerName="extract"
Apr 23 01:16:24.550883 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.550866 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f"
Apr 23 01:16:24.554090 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.554060 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mpphq\""
Apr 23 01:16:24.554225 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.554077 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 01:16:24.555281 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.555262 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 01:16:24.560675 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.560653 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f"]
Apr 23 01:16:24.697909 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.697867 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmbvn\" (UniqueName: \"kubernetes.io/projected/17d1a0f6-5c81-4f52-892e-d1398c2deab2-kube-api-access-bmbvn\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f"
Apr 23 01:16:24.697909 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.697911 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f"
Apr 23 01:16:24.698145 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.698016 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.798684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.798598 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.798684 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.798665 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.798914 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.798703 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmbvn\" (UniqueName: \"kubernetes.io/projected/17d1a0f6-5c81-4f52-892e-d1398c2deab2-kube-api-access-bmbvn\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.799088 ip-10-0-135-74 kubenswrapper[2565]: 
I0423 01:16:24.799064 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.799158 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.799087 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.809497 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.809475 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmbvn\" (UniqueName: \"kubernetes.io/projected/17d1a0f6-5c81-4f52-892e-d1398c2deab2-kube-api-access-bmbvn\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.860629 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.860601 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:24.989585 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:24.989528 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f"] Apr 23 01:16:24.991809 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:16:24.991776 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17d1a0f6_5c81_4f52_892e_d1398c2deab2.slice/crio-a98a9fbf4732f2c18c0ac3dbd777e478c5d2df0bec943871bf3fb32dac9eef05 WatchSource:0}: Error finding container a98a9fbf4732f2c18c0ac3dbd777e478c5d2df0bec943871bf3fb32dac9eef05: Status 404 returned error can't find the container with id a98a9fbf4732f2c18c0ac3dbd777e478c5d2df0bec943871bf3fb32dac9eef05 Apr 23 01:16:25.754389 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:25.754355 2565 generic.go:358] "Generic (PLEG): container finished" podID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerID="689c3baf0a9694248f0526bb17fed239becac35a95210054aa6e0f7a29ad56f0" exitCode=0 Apr 23 01:16:25.754758 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:25.754440 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" event={"ID":"17d1a0f6-5c81-4f52-892e-d1398c2deab2","Type":"ContainerDied","Data":"689c3baf0a9694248f0526bb17fed239becac35a95210054aa6e0f7a29ad56f0"} Apr 23 01:16:25.754758 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:25.754471 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" event={"ID":"17d1a0f6-5c81-4f52-892e-d1398c2deab2","Type":"ContainerStarted","Data":"a98a9fbf4732f2c18c0ac3dbd777e478c5d2df0bec943871bf3fb32dac9eef05"} Apr 23 01:16:27.730912 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:16:27.730883 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-jtwlz" Apr 23 01:16:32.783091 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:32.783062 2565 generic.go:358] "Generic (PLEG): container finished" podID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerID="51e4d29115f5f85b76b9faa2ce4f56683a9129f5d98648d274be668c93b80c0c" exitCode=0 Apr 23 01:16:32.783455 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:32.783143 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" event={"ID":"17d1a0f6-5c81-4f52-892e-d1398c2deab2","Type":"ContainerDied","Data":"51e4d29115f5f85b76b9faa2ce4f56683a9129f5d98648d274be668c93b80c0c"} Apr 23 01:16:33.788345 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:33.788304 2565 generic.go:358] "Generic (PLEG): container finished" podID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerID="1de69b5221f6cc4328fa3489503981ebe7b5d3fa4ac4eabaa878575efefb1744" exitCode=0 Apr 23 01:16:33.788690 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:33.788383 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" event={"ID":"17d1a0f6-5c81-4f52-892e-d1398c2deab2","Type":"ContainerDied","Data":"1de69b5221f6cc4328fa3489503981ebe7b5d3fa4ac4eabaa878575efefb1744"} Apr 23 01:16:34.916486 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:34.916460 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:16:35.085008 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.084884 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-bundle\") pod \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " Apr 23 01:16:35.085191 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.085013 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmbvn\" (UniqueName: \"kubernetes.io/projected/17d1a0f6-5c81-4f52-892e-d1398c2deab2-kube-api-access-bmbvn\") pod \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " Apr 23 01:16:35.085191 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.085038 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-util\") pod \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\" (UID: \"17d1a0f6-5c81-4f52-892e-d1398c2deab2\") " Apr 23 01:16:35.085918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.085892 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-bundle" (OuterVolumeSpecName: "bundle") pod "17d1a0f6-5c81-4f52-892e-d1398c2deab2" (UID: "17d1a0f6-5c81-4f52-892e-d1398c2deab2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:16:35.087077 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.087051 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17d1a0f6-5c81-4f52-892e-d1398c2deab2-kube-api-access-bmbvn" (OuterVolumeSpecName: "kube-api-access-bmbvn") pod "17d1a0f6-5c81-4f52-892e-d1398c2deab2" (UID: "17d1a0f6-5c81-4f52-892e-d1398c2deab2"). InnerVolumeSpecName "kube-api-access-bmbvn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:16:35.092015 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.091964 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-util" (OuterVolumeSpecName: "util") pod "17d1a0f6-5c81-4f52-892e-d1398c2deab2" (UID: "17d1a0f6-5c81-4f52-892e-d1398c2deab2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:16:35.186330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.186287 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:16:35.186330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.186325 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bmbvn\" (UniqueName: \"kubernetes.io/projected/17d1a0f6-5c81-4f52-892e-d1398c2deab2-kube-api-access-bmbvn\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:16:35.186330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.186335 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17d1a0f6-5c81-4f52-892e-d1398c2deab2-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:16:35.796347 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.796312 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" event={"ID":"17d1a0f6-5c81-4f52-892e-d1398c2deab2","Type":"ContainerDied","Data":"a98a9fbf4732f2c18c0ac3dbd777e478c5d2df0bec943871bf3fb32dac9eef05"} Apr 23 01:16:35.796347 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.796346 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98a9fbf4732f2c18c0ac3dbd777e478c5d2df0bec943871bf3fb32dac9eef05" Apr 23 01:16:35.796550 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:16:35.796363 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebd7f7f" Apr 23 01:17:26.686622 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.686582 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d"] Apr 23 01:17:26.687186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.686916 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerName="pull" Apr 23 01:17:26.687186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.686932 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerName="pull" Apr 23 01:17:26.687186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.686969 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerName="util" Apr 23 01:17:26.687186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.686991 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerName="util" Apr 23 01:17:26.687186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.687004 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerName="extract" Apr 23 01:17:26.687186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.687012 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerName="extract" Apr 23 01:17:26.687186 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.687091 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="17d1a0f6-5c81-4f52-892e-d1398c2deab2" containerName="extract" Apr 23 01:17:26.690654 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.690637 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.693304 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.693280 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 01:17:26.694515 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.694498 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4dlc4\"" Apr 23 01:17:26.694577 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.694538 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 01:17:26.697627 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.697602 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d"] Apr 23 01:17:26.787876 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.787834 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4pq\" (UniqueName: \"kubernetes.io/projected/11c5f374-0168-48de-b916-7038711c16c8-kube-api-access-jb4pq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: 
\"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.788086 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.787936 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.788086 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.788024 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.889045 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.888971 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4pq\" (UniqueName: \"kubernetes.io/projected/11c5f374-0168-48de-b916-7038711c16c8-kube-api-access-jb4pq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.889240 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.889068 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.889240 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.889104 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.889475 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.889459 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.889512 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.889497 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:26.897352 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:26.897319 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4pq\" (UniqueName: \"kubernetes.io/projected/11c5f374-0168-48de-b916-7038711c16c8-kube-api-access-jb4pq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:27.001540 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:17:27.001443 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:27.087345 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.087311 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d"] Apr 23 01:17:27.092206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.092177 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.097096 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.097061 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d"] Apr 23 01:17:27.131908 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.131882 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d"] Apr 23 01:17:27.134411 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:17:27.134378 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11c5f374_0168_48de_b916_7038711c16c8.slice/crio-94dbe0a1164dc6976bb6158d8391353f7bb24f3ed9398408bd5597bea33fc248 WatchSource:0}: Error finding container 94dbe0a1164dc6976bb6158d8391353f7bb24f3ed9398408bd5597bea33fc248: Status 404 returned error can't find the container with id 94dbe0a1164dc6976bb6158d8391353f7bb24f3ed9398408bd5597bea33fc248 Apr 23 01:17:27.191458 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.191423 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-util\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.191564 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.191496 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2x95\" (UniqueName: \"kubernetes.io/projected/c4534e35-eb66-433b-9d99-6c8855233dfa-kube-api-access-p2x95\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.191661 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.191638 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.293203 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.293107 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.293203 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.293170 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2x95\" (UniqueName: \"kubernetes.io/projected/c4534e35-eb66-433b-9d99-6c8855233dfa-kube-api-access-p2x95\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.293396 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.293222 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.293552 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.293526 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.293671 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.293584 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.301708 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.301671 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2x95\" (UniqueName: \"kubernetes.io/projected/c4534e35-eb66-433b-9d99-6c8855233dfa-kube-api-access-p2x95\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.405749 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.405712 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:27.531047 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.531017 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d"] Apr 23 01:17:27.532443 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:17:27.532404 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4534e35_eb66_433b_9d99_6c8855233dfa.slice/crio-a06295dfd46f02bc1c0f37bd76756a819a933764d3fd9433c6beda6c68cd04c9 WatchSource:0}: Error finding container a06295dfd46f02bc1c0f37bd76756a819a933764d3fd9433c6beda6c68cd04c9: Status 404 returned error can't find the container with id a06295dfd46f02bc1c0f37bd76756a819a933764d3fd9433c6beda6c68cd04c9 Apr 23 01:17:27.685076 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.685031 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6"] Apr 23 01:17:27.688482 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.688463 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.694102 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.694070 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6"] Apr 23 01:17:27.797455 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.797411 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8999\" (UniqueName: \"kubernetes.io/projected/d1cd9850-356a-4f75-83b7-40ec06bf50bd-kube-api-access-s8999\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.797640 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.797489 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.797640 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.797516 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.898548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.898515 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s8999\" (UniqueName: \"kubernetes.io/projected/d1cd9850-356a-4f75-83b7-40ec06bf50bd-kube-api-access-s8999\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.898707 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.898570 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.898707 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.898598 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.898945 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.898928 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.899058 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.899031 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-util\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.907392 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.907368 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8999\" (UniqueName: \"kubernetes.io/projected/d1cd9850-356a-4f75-83b7-40ec06bf50bd-kube-api-access-s8999\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:27.978441 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.978401 2565 generic.go:358] "Generic (PLEG): container finished" podID="11c5f374-0168-48de-b916-7038711c16c8" containerID="e92b412d7041329b7bb9a52488cb1ef99b04f6b6ad424901806a42f8aa7a086a" exitCode=0 Apr 23 01:17:27.978622 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.978493 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" event={"ID":"11c5f374-0168-48de-b916-7038711c16c8","Type":"ContainerDied","Data":"e92b412d7041329b7bb9a52488cb1ef99b04f6b6ad424901806a42f8aa7a086a"} Apr 23 01:17:27.978622 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.978534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" event={"ID":"11c5f374-0168-48de-b916-7038711c16c8","Type":"ContainerStarted","Data":"94dbe0a1164dc6976bb6158d8391353f7bb24f3ed9398408bd5597bea33fc248"} Apr 23 01:17:27.982631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.980289 2565 generic.go:358] "Generic (PLEG): container finished" podID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerID="c2182b6815c0b871b0a700ebd191e908052c764d9d943c7afb52736318dcbb2a" exitCode=0 Apr 23 
01:17:27.982631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.980396 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" event={"ID":"c4534e35-eb66-433b-9d99-6c8855233dfa","Type":"ContainerDied","Data":"c2182b6815c0b871b0a700ebd191e908052c764d9d943c7afb52736318dcbb2a"} Apr 23 01:17:27.982631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:27.980419 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" event={"ID":"c4534e35-eb66-433b-9d99-6c8855233dfa","Type":"ContainerStarted","Data":"a06295dfd46f02bc1c0f37bd76756a819a933764d3fd9433c6beda6c68cd04c9"} Apr 23 01:17:28.009857 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.009823 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:28.136553 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.136522 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6"] Apr 23 01:17:28.138364 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:17:28.138329 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1cd9850_356a_4f75_83b7_40ec06bf50bd.slice/crio-f6a2b65d912ad11332409870a213abba611cb7597417d8b85b4fd21df5ab53e7 WatchSource:0}: Error finding container f6a2b65d912ad11332409870a213abba611cb7597417d8b85b4fd21df5ab53e7: Status 404 returned error can't find the container with id f6a2b65d912ad11332409870a213abba611cb7597417d8b85b4fd21df5ab53e7 Apr 23 01:17:28.290239 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.290195 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l"] Apr 23 
01:17:28.293654 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.293628 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.300144 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.300115 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l"] Apr 23 01:17:28.403600 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.403559 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.403802 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.403616 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bbk\" (UniqueName: \"kubernetes.io/projected/1e52617f-6882-4d96-8047-3acbbcef7a99-kube-api-access-l4bbk\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.403802 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.403635 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.505180 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:17:28.505078 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bbk\" (UniqueName: \"kubernetes.io/projected/1e52617f-6882-4d96-8047-3acbbcef7a99-kube-api-access-l4bbk\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.505180 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.505123 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.505427 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.505186 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.505584 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.505564 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.505652 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.505610 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.514071 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.514044 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bbk\" (UniqueName: \"kubernetes.io/projected/1e52617f-6882-4d96-8047-3acbbcef7a99-kube-api-access-l4bbk\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.604071 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.604036 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:28.738100 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.738072 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l"] Apr 23 01:17:28.781201 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:17:28.781116 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e52617f_6882_4d96_8047_3acbbcef7a99.slice/crio-9e20a231c1f247a77143d8527ed76a6f54756b440094620af71da3d5efc6fd21 WatchSource:0}: Error finding container 9e20a231c1f247a77143d8527ed76a6f54756b440094620af71da3d5efc6fd21: Status 404 returned error can't find the container with id 9e20a231c1f247a77143d8527ed76a6f54756b440094620af71da3d5efc6fd21 Apr 23 01:17:28.987487 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.987439 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" event={"ID":"c4534e35-eb66-433b-9d99-6c8855233dfa","Type":"ContainerStarted","Data":"936b48a054f9d2412397fe83691671f84e4ad3e94aefa5f92b9cfe1334425d1d"} Apr 23 01:17:28.989507 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.989466 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" event={"ID":"1e52617f-6882-4d96-8047-3acbbcef7a99","Type":"ContainerStarted","Data":"5a46769c27a76387c9a4642b64aaed54b612717611edfdc16d3caf77c456904b"} Apr 23 01:17:28.989507 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.989514 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" event={"ID":"1e52617f-6882-4d96-8047-3acbbcef7a99","Type":"ContainerStarted","Data":"9e20a231c1f247a77143d8527ed76a6f54756b440094620af71da3d5efc6fd21"} Apr 23 01:17:28.991711 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.991607 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" event={"ID":"11c5f374-0168-48de-b916-7038711c16c8","Type":"ContainerStarted","Data":"bda944f14eaa65d1cc8b7049da42ba99c26f6760e7192f8e06568b65d0eae005"} Apr 23 01:17:28.993393 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.993349 2565 generic.go:358] "Generic (PLEG): container finished" podID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerID="fbb6a93ea5ab59a2f5c0585de70173f0bcb265af43c9d08d485611f2cdb916ab" exitCode=0 Apr 23 01:17:28.993741 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.993430 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" 
event={"ID":"d1cd9850-356a-4f75-83b7-40ec06bf50bd","Type":"ContainerDied","Data":"fbb6a93ea5ab59a2f5c0585de70173f0bcb265af43c9d08d485611f2cdb916ab"} Apr 23 01:17:28.993741 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:28.993455 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" event={"ID":"d1cd9850-356a-4f75-83b7-40ec06bf50bd","Type":"ContainerStarted","Data":"f6a2b65d912ad11332409870a213abba611cb7597417d8b85b4fd21df5ab53e7"} Apr 23 01:17:30.001036 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.000915 2565 generic.go:358] "Generic (PLEG): container finished" podID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerID="5a46769c27a76387c9a4642b64aaed54b612717611edfdc16d3caf77c456904b" exitCode=0 Apr 23 01:17:30.001036 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.001002 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" event={"ID":"1e52617f-6882-4d96-8047-3acbbcef7a99","Type":"ContainerDied","Data":"5a46769c27a76387c9a4642b64aaed54b612717611edfdc16d3caf77c456904b"} Apr 23 01:17:30.002680 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.002653 2565 generic.go:358] "Generic (PLEG): container finished" podID="11c5f374-0168-48de-b916-7038711c16c8" containerID="bda944f14eaa65d1cc8b7049da42ba99c26f6760e7192f8e06568b65d0eae005" exitCode=0 Apr 23 01:17:30.002800 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.002719 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" event={"ID":"11c5f374-0168-48de-b916-7038711c16c8","Type":"ContainerDied","Data":"bda944f14eaa65d1cc8b7049da42ba99c26f6760e7192f8e06568b65d0eae005"} Apr 23 01:17:30.004427 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.004375 2565 generic.go:358] "Generic (PLEG): container finished" podID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" 
containerID="36375d65fe5dd164d8c95126cb409691bb24ba817bfd8f39b878d6dfc92b177c" exitCode=0 Apr 23 01:17:30.004585 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.004497 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" event={"ID":"d1cd9850-356a-4f75-83b7-40ec06bf50bd","Type":"ContainerDied","Data":"36375d65fe5dd164d8c95126cb409691bb24ba817bfd8f39b878d6dfc92b177c"} Apr 23 01:17:30.006232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.006204 2565 generic.go:358] "Generic (PLEG): container finished" podID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerID="936b48a054f9d2412397fe83691671f84e4ad3e94aefa5f92b9cfe1334425d1d" exitCode=0 Apr 23 01:17:30.006324 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:30.006282 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" event={"ID":"c4534e35-eb66-433b-9d99-6c8855233dfa","Type":"ContainerDied","Data":"936b48a054f9d2412397fe83691671f84e4ad3e94aefa5f92b9cfe1334425d1d"} Apr 23 01:17:31.012032 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.011901 2565 generic.go:358] "Generic (PLEG): container finished" podID="11c5f374-0168-48de-b916-7038711c16c8" containerID="9b70fa8d3c51e00b87c05980c96d24d77eb464eed61381de6f8f2a2a016e92c7" exitCode=0 Apr 23 01:17:31.012032 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.012000 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" event={"ID":"11c5f374-0168-48de-b916-7038711c16c8","Type":"ContainerDied","Data":"9b70fa8d3c51e00b87c05980c96d24d77eb464eed61381de6f8f2a2a016e92c7"} Apr 23 01:17:31.013808 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.013782 2565 generic.go:358] "Generic (PLEG): container finished" podID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" 
containerID="6988ff8ca1212cf26f9677bdf2c220ed2e68bc79951a6fd2248a3a3d319542c5" exitCode=0 Apr 23 01:17:31.013925 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.013858 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" event={"ID":"d1cd9850-356a-4f75-83b7-40ec06bf50bd","Type":"ContainerDied","Data":"6988ff8ca1212cf26f9677bdf2c220ed2e68bc79951a6fd2248a3a3d319542c5"} Apr 23 01:17:31.015730 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.015692 2565 generic.go:358] "Generic (PLEG): container finished" podID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerID="b7406e905c295b5d47ecf3c772c2f718d8ab6581fb018fa34e526fdc709dd661" exitCode=0 Apr 23 01:17:31.015849 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.015771 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" event={"ID":"c4534e35-eb66-433b-9d99-6c8855233dfa","Type":"ContainerDied","Data":"b7406e905c295b5d47ecf3c772c2f718d8ab6581fb018fa34e526fdc709dd661"} Apr 23 01:17:31.017479 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.017456 2565 generic.go:358] "Generic (PLEG): container finished" podID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerID="432ba1ca052a7cebeaaee4214330d5b079022300b2a82293c369b5bc0d053e4c" exitCode=0 Apr 23 01:17:31.017609 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:31.017488 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" event={"ID":"1e52617f-6882-4d96-8047-3acbbcef7a99","Type":"ContainerDied","Data":"432ba1ca052a7cebeaaee4214330d5b079022300b2a82293c369b5bc0d053e4c"} Apr 23 01:17:32.023600 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.023562 2565 generic.go:358] "Generic (PLEG): container finished" podID="1e52617f-6882-4d96-8047-3acbbcef7a99" 
containerID="075f0188dfd43790a6e77725c4bd9bbee34cc7e7aa494d7476c458e30151b3e9" exitCode=0 Apr 23 01:17:32.024136 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.023640 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" event={"ID":"1e52617f-6882-4d96-8047-3acbbcef7a99","Type":"ContainerDied","Data":"075f0188dfd43790a6e77725c4bd9bbee34cc7e7aa494d7476c458e30151b3e9"} Apr 23 01:17:32.178547 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.178043 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:32.231297 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.231270 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:32.234749 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.234722 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:32.242209 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.242184 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8999\" (UniqueName: \"kubernetes.io/projected/d1cd9850-356a-4f75-83b7-40ec06bf50bd-kube-api-access-s8999\") pod \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " Apr 23 01:17:32.242335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.242227 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-bundle\") pod \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " Apr 23 01:17:32.242415 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.242399 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-util\") pod \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\" (UID: \"d1cd9850-356a-4f75-83b7-40ec06bf50bd\") " Apr 23 01:17:32.242924 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.242896 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-bundle" (OuterVolumeSpecName: "bundle") pod "d1cd9850-356a-4f75-83b7-40ec06bf50bd" (UID: "d1cd9850-356a-4f75-83b7-40ec06bf50bd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:32.244579 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.244554 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1cd9850-356a-4f75-83b7-40ec06bf50bd-kube-api-access-s8999" (OuterVolumeSpecName: "kube-api-access-s8999") pod "d1cd9850-356a-4f75-83b7-40ec06bf50bd" (UID: "d1cd9850-356a-4f75-83b7-40ec06bf50bd"). InnerVolumeSpecName "kube-api-access-s8999". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:17:32.247787 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.247755 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-util" (OuterVolumeSpecName: "util") pod "d1cd9850-356a-4f75-83b7-40ec06bf50bd" (UID: "d1cd9850-356a-4f75-83b7-40ec06bf50bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:32.343421 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343299 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-util\") pod \"c4534e35-eb66-433b-9d99-6c8855233dfa\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " Apr 23 01:17:32.343421 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343353 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-bundle\") pod \"11c5f374-0168-48de-b916-7038711c16c8\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " Apr 23 01:17:32.343421 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343386 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb4pq\" (UniqueName: \"kubernetes.io/projected/11c5f374-0168-48de-b916-7038711c16c8-kube-api-access-jb4pq\") pod 
\"11c5f374-0168-48de-b916-7038711c16c8\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " Apr 23 01:17:32.343421 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343407 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-util\") pod \"11c5f374-0168-48de-b916-7038711c16c8\" (UID: \"11c5f374-0168-48de-b916-7038711c16c8\") " Apr 23 01:17:32.343936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343484 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2x95\" (UniqueName: \"kubernetes.io/projected/c4534e35-eb66-433b-9d99-6c8855233dfa-kube-api-access-p2x95\") pod \"c4534e35-eb66-433b-9d99-6c8855233dfa\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " Apr 23 01:17:32.343936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343528 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-bundle\") pod \"c4534e35-eb66-433b-9d99-6c8855233dfa\" (UID: \"c4534e35-eb66-433b-9d99-6c8855233dfa\") " Apr 23 01:17:32.343936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343726 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:32.343936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343738 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8999\" (UniqueName: \"kubernetes.io/projected/d1cd9850-356a-4f75-83b7-40ec06bf50bd-kube-api-access-s8999\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:32.343936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343751 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d1cd9850-356a-4f75-83b7-40ec06bf50bd-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:32.343936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.343888 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-bundle" (OuterVolumeSpecName: "bundle") pod "11c5f374-0168-48de-b916-7038711c16c8" (UID: "11c5f374-0168-48de-b916-7038711c16c8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:32.344249 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.344212 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-bundle" (OuterVolumeSpecName: "bundle") pod "c4534e35-eb66-433b-9d99-6c8855233dfa" (UID: "c4534e35-eb66-433b-9d99-6c8855233dfa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:32.346191 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.346166 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c5f374-0168-48de-b916-7038711c16c8-kube-api-access-jb4pq" (OuterVolumeSpecName: "kube-api-access-jb4pq") pod "11c5f374-0168-48de-b916-7038711c16c8" (UID: "11c5f374-0168-48de-b916-7038711c16c8"). InnerVolumeSpecName "kube-api-access-jb4pq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:17:32.346314 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.346202 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4534e35-eb66-433b-9d99-6c8855233dfa-kube-api-access-p2x95" (OuterVolumeSpecName: "kube-api-access-p2x95") pod "c4534e35-eb66-433b-9d99-6c8855233dfa" (UID: "c4534e35-eb66-433b-9d99-6c8855233dfa"). InnerVolumeSpecName "kube-api-access-p2x95". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:17:32.349608 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.349580 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-util" (OuterVolumeSpecName: "util") pod "c4534e35-eb66-433b-9d99-6c8855233dfa" (UID: "c4534e35-eb66-433b-9d99-6c8855233dfa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:32.445030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.444971 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p2x95\" (UniqueName: \"kubernetes.io/projected/c4534e35-eb66-433b-9d99-6c8855233dfa-kube-api-access-p2x95\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:32.445030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.445028 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:32.445030 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.445041 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4534e35-eb66-433b-9d99-6c8855233dfa-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:32.445270 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.445049 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:32.445270 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.445058 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jb4pq\" (UniqueName: \"kubernetes.io/projected/11c5f374-0168-48de-b916-7038711c16c8-kube-api-access-jb4pq\") on node \"ip-10-0-135-74.ec2.internal\" 
DevicePath \"\"" Apr 23 01:17:32.484088 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.484039 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-util" (OuterVolumeSpecName: "util") pod "11c5f374-0168-48de-b916-7038711c16c8" (UID: "11c5f374-0168-48de-b916-7038711c16c8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:32.545464 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:32.545420 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11c5f374-0168-48de-b916-7038711c16c8-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:33.029029 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.028999 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" Apr 23 01:17:33.029469 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.029000 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d" event={"ID":"c4534e35-eb66-433b-9d99-6c8855233dfa","Type":"ContainerDied","Data":"a06295dfd46f02bc1c0f37bd76756a819a933764d3fd9433c6beda6c68cd04c9"} Apr 23 01:17:33.029469 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.029117 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a06295dfd46f02bc1c0f37bd76756a819a933764d3fd9433c6beda6c68cd04c9" Apr 23 01:17:33.030715 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.030687 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" Apr 23 01:17:33.030865 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.030682 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d" event={"ID":"11c5f374-0168-48de-b916-7038711c16c8","Type":"ContainerDied","Data":"94dbe0a1164dc6976bb6158d8391353f7bb24f3ed9398408bd5597bea33fc248"} Apr 23 01:17:33.030865 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.030794 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94dbe0a1164dc6976bb6158d8391353f7bb24f3ed9398408bd5597bea33fc248" Apr 23 01:17:33.032597 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.032577 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" Apr 23 01:17:33.032597 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.032589 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6" event={"ID":"d1cd9850-356a-4f75-83b7-40ec06bf50bd","Type":"ContainerDied","Data":"f6a2b65d912ad11332409870a213abba611cb7597417d8b85b4fd21df5ab53e7"} Apr 23 01:17:33.032756 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.032617 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6a2b65d912ad11332409870a213abba611cb7597417d8b85b4fd21df5ab53e7" Apr 23 01:17:33.151216 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.151187 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:33.253075 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.253028 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-util\") pod \"1e52617f-6882-4d96-8047-3acbbcef7a99\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " Apr 23 01:17:33.253075 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.253072 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-bundle\") pod \"1e52617f-6882-4d96-8047-3acbbcef7a99\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " Apr 23 01:17:33.253299 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.253104 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bbk\" (UniqueName: \"kubernetes.io/projected/1e52617f-6882-4d96-8047-3acbbcef7a99-kube-api-access-l4bbk\") pod \"1e52617f-6882-4d96-8047-3acbbcef7a99\" (UID: \"1e52617f-6882-4d96-8047-3acbbcef7a99\") " Apr 23 01:17:33.253645 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.253619 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-bundle" (OuterVolumeSpecName: "bundle") pod "1e52617f-6882-4d96-8047-3acbbcef7a99" (UID: "1e52617f-6882-4d96-8047-3acbbcef7a99"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:33.255394 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.255363 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e52617f-6882-4d96-8047-3acbbcef7a99-kube-api-access-l4bbk" (OuterVolumeSpecName: "kube-api-access-l4bbk") pod "1e52617f-6882-4d96-8047-3acbbcef7a99" (UID: "1e52617f-6882-4d96-8047-3acbbcef7a99"). InnerVolumeSpecName "kube-api-access-l4bbk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:17:33.260705 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.260656 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-util" (OuterVolumeSpecName: "util") pod "1e52617f-6882-4d96-8047-3acbbcef7a99" (UID: "1e52617f-6882-4d96-8047-3acbbcef7a99"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:17:33.354052 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.353949 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-util\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:33.354052 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.354004 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e52617f-6882-4d96-8047-3acbbcef7a99-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:33.354052 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:33.354015 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4bbk\" (UniqueName: \"kubernetes.io/projected/1e52617f-6882-4d96-8047-3acbbcef7a99-kube-api-access-l4bbk\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:17:34.037617 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:34.037580 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" event={"ID":"1e52617f-6882-4d96-8047-3acbbcef7a99","Type":"ContainerDied","Data":"9e20a231c1f247a77143d8527ed76a6f54756b440094620af71da3d5efc6fd21"} Apr 23 01:17:34.037617 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:34.037608 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l" Apr 23 01:17:34.037617 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:34.037620 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e20a231c1f247a77143d8527ed76a6f54756b440094620af71da3d5efc6fd21" Apr 23 01:17:55.406850 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:17:55.406802 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fd49b8965-hclkn"] Apr 23 01:18:03.625563 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.625517 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq"] Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.625970 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626007 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626023 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11c5f374-0168-48de-b916-7038711c16c8" containerName="pull" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626031 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c5f374-0168-48de-b916-7038711c16c8" containerName="pull" Apr 23 01:18:03.626211 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626042 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626051 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626067 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerName="extract" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626077 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerName="extract" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626094 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11c5f374-0168-48de-b916-7038711c16c8" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626103 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c5f374-0168-48de-b916-7038711c16c8" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626124 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerName="pull" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626132 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerName="pull" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626143 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626151 2565 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerName="util" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626160 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerName="pull" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626168 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerName="pull" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626178 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerName="extract" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626186 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerName="extract" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626194 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerName="pull" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626202 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerName="pull" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626213 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerName="extract" Apr 23 01:18:03.626211 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626222 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerName="extract" Apr 23 01:18:03.627232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626231 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="11c5f374-0168-48de-b916-7038711c16c8" containerName="extract" Apr 23 01:18:03.627232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626239 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c5f374-0168-48de-b916-7038711c16c8" containerName="extract" Apr 23 01:18:03.627232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626353 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e52617f-6882-4d96-8047-3acbbcef7a99" containerName="extract" Apr 23 01:18:03.627232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626369 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4534e35-eb66-433b-9d99-6c8855233dfa" containerName="extract" Apr 23 01:18:03.627232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626382 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1cd9850-356a-4f75-83b7-40ec06bf50bd" containerName="extract" Apr 23 01:18:03.627232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.626392 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="11c5f374-0168-48de-b916-7038711c16c8" containerName="extract" Apr 23 01:18:03.629697 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.629677 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.632719 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.632684 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 01:18:03.633619 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.633594 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 01:18:03.633755 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.633737 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 23 01:18:03.633994 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.633960 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4dlc4\"" Apr 23 01:18:03.634266 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.634246 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 23 01:18:03.636318 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.636286 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq"] Apr 23 01:18:03.701931 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.701884 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02a14d76-8ef3-486a-b525-4477611711e7-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.701931 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.701929 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgt6\" (UniqueName: 
\"kubernetes.io/projected/02a14d76-8ef3-486a-b525-4477611711e7-kube-api-access-gsgt6\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.702176 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.702072 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02a14d76-8ef3-486a-b525-4477611711e7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.803530 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.803479 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02a14d76-8ef3-486a-b525-4477611711e7-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.803530 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.803526 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgt6\" (UniqueName: \"kubernetes.io/projected/02a14d76-8ef3-486a-b525-4477611711e7-kube-api-access-gsgt6\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.803899 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.803590 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02a14d76-8ef3-486a-b525-4477611711e7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.804322 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.804293 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02a14d76-8ef3-486a-b525-4477611711e7-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.806256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.806236 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02a14d76-8ef3-486a-b525-4477611711e7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.810850 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.810824 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgt6\" (UniqueName: \"kubernetes.io/projected/02a14d76-8ef3-486a-b525-4477611711e7-kube-api-access-gsgt6\") pod \"kuadrant-console-plugin-6cb54b5c86-8sswq\" (UID: \"02a14d76-8ef3-486a-b525-4477611711e7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:03.940232 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:03.940166 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" Apr 23 01:18:04.075403 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:04.075374 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq"] Apr 23 01:18:04.076880 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:18:04.076842 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a14d76_8ef3_486a_b525_4477611711e7.slice/crio-466b68a7754340a5ab3cbf0f9b8c9d28397eb5153d5f5abd96953c54a54ceb59 WatchSource:0}: Error finding container 466b68a7754340a5ab3cbf0f9b8c9d28397eb5153d5f5abd96953c54a54ceb59: Status 404 returned error can't find the container with id 466b68a7754340a5ab3cbf0f9b8c9d28397eb5153d5f5abd96953c54a54ceb59 Apr 23 01:18:04.141706 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:04.141662 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" event={"ID":"02a14d76-8ef3-486a-b525-4477611711e7","Type":"ContainerStarted","Data":"466b68a7754340a5ab3cbf0f9b8c9d28397eb5153d5f5abd96953c54a54ceb59"} Apr 23 01:18:20.428256 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:20.428188 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fd49b8965-hclkn" podUID="b5faffb9-3444-4db1-869e-e329c8f61648" containerName="console" containerID="cri-o://3d12a0fbffa36fec0194641e6d73c9ee559a64e8c1d49217bba78211f602cbd6" gracePeriod=15 Apr 23 01:18:29.286130 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:29.286078 2565 patch_prober.go:28] interesting pod/console-5fd49b8965-hclkn container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" start-of-body= Apr 23 01:18:29.286617 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:18:29.286162 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-5fd49b8965-hclkn" podUID="b5faffb9-3444-4db1-869e-e329c8f61648" containerName="console" probeResult="failure" output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" Apr 23 01:18:30.252821 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.252798 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd49b8965-hclkn_b5faffb9-3444-4db1-869e-e329c8f61648/console/0.log" Apr 23 01:18:30.252914 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.252839 2565 generic.go:358] "Generic (PLEG): container finished" podID="b5faffb9-3444-4db1-869e-e329c8f61648" containerID="3d12a0fbffa36fec0194641e6d73c9ee559a64e8c1d49217bba78211f602cbd6" exitCode=2 Apr 23 01:18:30.252960 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.252913 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd49b8965-hclkn" event={"ID":"b5faffb9-3444-4db1-869e-e329c8f61648","Type":"ContainerDied","Data":"3d12a0fbffa36fec0194641e6d73c9ee559a64e8c1d49217bba78211f602cbd6"} Apr 23 01:18:30.288409 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.288383 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd49b8965-hclkn_b5faffb9-3444-4db1-869e-e329c8f61648/console/0.log" Apr 23 01:18:30.288723 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.288451 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:18:30.458145 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458103 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-service-ca\") pod \"b5faffb9-3444-4db1-869e-e329c8f61648\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " Apr 23 01:18:30.458335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458149 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-trusted-ca-bundle\") pod \"b5faffb9-3444-4db1-869e-e329c8f61648\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " Apr 23 01:18:30.458335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458197 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-serving-cert\") pod \"b5faffb9-3444-4db1-869e-e329c8f61648\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " Apr 23 01:18:30.458335 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458306 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-oauth-serving-cert\") pod \"b5faffb9-3444-4db1-869e-e329c8f61648\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " Apr 23 01:18:30.458483 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458405 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpdhz\" (UniqueName: \"kubernetes.io/projected/b5faffb9-3444-4db1-869e-e329c8f61648-kube-api-access-fpdhz\") pod \"b5faffb9-3444-4db1-869e-e329c8f61648\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " Apr 23 01:18:30.458483 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458449 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-console-config\") pod \"b5faffb9-3444-4db1-869e-e329c8f61648\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " Apr 23 01:18:30.458678 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458618 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-service-ca" (OuterVolumeSpecName: "service-ca") pod "b5faffb9-3444-4db1-869e-e329c8f61648" (UID: "b5faffb9-3444-4db1-869e-e329c8f61648"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:18:30.458678 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458641 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b5faffb9-3444-4db1-869e-e329c8f61648" (UID: "b5faffb9-3444-4db1-869e-e329c8f61648"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:18:30.458776 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458724 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b5faffb9-3444-4db1-869e-e329c8f61648" (UID: "b5faffb9-3444-4db1-869e-e329c8f61648"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:18:30.458823 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458793 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-oauth-config\") pod \"b5faffb9-3444-4db1-869e-e329c8f61648\" (UID: \"b5faffb9-3444-4db1-869e-e329c8f61648\") " Apr 23 01:18:30.458823 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.458814 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-console-config" (OuterVolumeSpecName: "console-config") pod "b5faffb9-3444-4db1-869e-e329c8f61648" (UID: "b5faffb9-3444-4db1-869e-e329c8f61648"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:18:30.459127 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.459106 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-console-config\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:18:30.459127 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.459127 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-service-ca\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:18:30.459399 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.459137 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-trusted-ca-bundle\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:18:30.459399 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.459149 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/b5faffb9-3444-4db1-869e-e329c8f61648-oauth-serving-cert\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:18:30.460665 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.460637 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b5faffb9-3444-4db1-869e-e329c8f61648" (UID: "b5faffb9-3444-4db1-869e-e329c8f61648"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:18:30.460763 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.460712 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5faffb9-3444-4db1-869e-e329c8f61648-kube-api-access-fpdhz" (OuterVolumeSpecName: "kube-api-access-fpdhz") pod "b5faffb9-3444-4db1-869e-e329c8f61648" (UID: "b5faffb9-3444-4db1-869e-e329c8f61648"). InnerVolumeSpecName "kube-api-access-fpdhz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:18:30.460817 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.460798 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b5faffb9-3444-4db1-869e-e329c8f61648" (UID: "b5faffb9-3444-4db1-869e-e329c8f61648"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:18:30.560092 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.560047 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-serving-cert\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:18:30.560092 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.560084 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpdhz\" (UniqueName: \"kubernetes.io/projected/b5faffb9-3444-4db1-869e-e329c8f61648-kube-api-access-fpdhz\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:18:30.560092 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:30.560095 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5faffb9-3444-4db1-869e-e329c8f61648-console-oauth-config\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:18:31.258639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.258593 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" event={"ID":"02a14d76-8ef3-486a-b525-4477611711e7","Type":"ContainerStarted","Data":"22233a63414eb6ff3d634bfdd702315b5c4ef30050f498aebc6b3d73668c706a"} Apr 23 01:18:31.259806 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.259782 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd49b8965-hclkn_b5faffb9-3444-4db1-869e-e329c8f61648/console/0.log" Apr 23 01:18:31.259942 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.259858 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd49b8965-hclkn" event={"ID":"b5faffb9-3444-4db1-869e-e329c8f61648","Type":"ContainerDied","Data":"5feeb43580e6e566561048fb5d6d25d5ff8f7fec120b946f4c0f902513ea6ddc"} Apr 23 01:18:31.259942 ip-10-0-135-74 kubenswrapper[2565]: 
I0423 01:18:31.259879 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fd49b8965-hclkn" Apr 23 01:18:31.259942 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.259894 2565 scope.go:117] "RemoveContainer" containerID="3d12a0fbffa36fec0194641e6d73c9ee559a64e8c1d49217bba78211f602cbd6" Apr 23 01:18:31.275526 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.275481 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8sswq" podStartSLOduration=2.139820962 podStartE2EDuration="28.275464643s" podCreationTimestamp="2026-04-23 01:18:03 +0000 UTC" firstStartedPulling="2026-04-23 01:18:04.078297852 +0000 UTC m=+471.038151290" lastFinishedPulling="2026-04-23 01:18:30.213941516 +0000 UTC m=+497.173794971" observedRunningTime="2026-04-23 01:18:31.273920904 +0000 UTC m=+498.233774365" watchObservedRunningTime="2026-04-23 01:18:31.275464643 +0000 UTC m=+498.235318102" Apr 23 01:18:31.290549 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.290516 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fd49b8965-hclkn"] Apr 23 01:18:31.296017 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.295991 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fd49b8965-hclkn"] Apr 23 01:18:31.560951 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:31.560913 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5faffb9-3444-4db1-869e-e329c8f61648" path="/var/lib/kubelet/pods/b5faffb9-3444-4db1-869e-e329c8f61648/volumes" Apr 23 01:18:57.027127 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.027090 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jpfjz"] Apr 23 01:18:57.027627 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.027469 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b5faffb9-3444-4db1-869e-e329c8f61648" containerName="console" Apr 23 01:18:57.027627 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.027489 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5faffb9-3444-4db1-869e-e329c8f61648" containerName="console" Apr 23 01:18:57.027808 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.027630 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5faffb9-3444-4db1-869e-e329c8f61648" containerName="console" Apr 23 01:18:57.029340 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.029320 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" Apr 23 01:18:57.031855 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.031824 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tgddk\"" Apr 23 01:18:57.037247 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.037215 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jpfjz"] Apr 23 01:18:57.037594 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.037568 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvfgv\" (UniqueName: \"kubernetes.io/projected/5f8268a0-fe0a-4c34-a9de-47fd8cefce48-kube-api-access-qvfgv\") pod \"authorino-f99f4b5cd-jpfjz\" (UID: \"5f8268a0-fe0a-4c34-a9de-47fd8cefce48\") " pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" Apr 23 01:18:57.138677 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.138655 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvfgv\" (UniqueName: \"kubernetes.io/projected/5f8268a0-fe0a-4c34-a9de-47fd8cefce48-kube-api-access-qvfgv\") pod \"authorino-f99f4b5cd-jpfjz\" (UID: \"5f8268a0-fe0a-4c34-a9de-47fd8cefce48\") " pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" Apr 23 01:18:57.150220 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:18:57.150193 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvfgv\" (UniqueName: \"kubernetes.io/projected/5f8268a0-fe0a-4c34-a9de-47fd8cefce48-kube-api-access-qvfgv\") pod \"authorino-f99f4b5cd-jpfjz\" (UID: \"5f8268a0-fe0a-4c34-a9de-47fd8cefce48\") " pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" Apr 23 01:18:57.154748 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.154729 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-rrwr9"] Apr 23 01:18:57.158890 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.158869 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rrwr9" Apr 23 01:18:57.164747 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.164726 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-rrwr9"] Apr 23 01:18:57.239651 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.239628 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8wz\" (UniqueName: \"kubernetes.io/projected/f064e922-a752-4acb-a992-31245b84d764-kube-api-access-zd8wz\") pod \"authorino-7498df8756-rrwr9\" (UID: \"f064e922-a752-4acb-a992-31245b84d764\") " pod="kuadrant-system/authorino-7498df8756-rrwr9" Apr 23 01:18:57.340330 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.340288 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" Apr 23 01:18:57.340453 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.340433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8wz\" (UniqueName: \"kubernetes.io/projected/f064e922-a752-4acb-a992-31245b84d764-kube-api-access-zd8wz\") pod \"authorino-7498df8756-rrwr9\" (UID: \"f064e922-a752-4acb-a992-31245b84d764\") " pod="kuadrant-system/authorino-7498df8756-rrwr9" Apr 23 01:18:57.347654 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.347631 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8wz\" (UniqueName: \"kubernetes.io/projected/f064e922-a752-4acb-a992-31245b84d764-kube-api-access-zd8wz\") pod \"authorino-7498df8756-rrwr9\" (UID: \"f064e922-a752-4acb-a992-31245b84d764\") " pod="kuadrant-system/authorino-7498df8756-rrwr9" Apr 23 01:18:57.453102 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.453075 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jpfjz"] Apr 23 01:18:57.455879 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:18:57.455845 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f8268a0_fe0a_4c34_a9de_47fd8cefce48.slice/crio-1942810f504c148ac6856c6ae16faf0264e4ede4a1f9bc958df77f79ecd728f3 WatchSource:0}: Error finding container 1942810f504c148ac6856c6ae16faf0264e4ede4a1f9bc958df77f79ecd728f3: Status 404 returned error can't find the container with id 1942810f504c148ac6856c6ae16faf0264e4ede4a1f9bc958df77f79ecd728f3 Apr 23 01:18:57.468856 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.468837 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rrwr9" Apr 23 01:18:57.580500 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:57.580482 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-rrwr9"] Apr 23 01:18:57.582394 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:18:57.582369 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf064e922_a752_4acb_a992_31245b84d764.slice/crio-7d33e1bde09da142c500f45931a5989c299b10db0873df57c0c0c1175bfd33a4 WatchSource:0}: Error finding container 7d33e1bde09da142c500f45931a5989c299b10db0873df57c0c0c1175bfd33a4: Status 404 returned error can't find the container with id 7d33e1bde09da142c500f45931a5989c299b10db0873df57c0c0c1175bfd33a4 Apr 23 01:18:58.363213 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:58.363129 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" event={"ID":"5f8268a0-fe0a-4c34-a9de-47fd8cefce48","Type":"ContainerStarted","Data":"1942810f504c148ac6856c6ae16faf0264e4ede4a1f9bc958df77f79ecd728f3"} Apr 23 01:18:58.366063 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:18:58.366031 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-rrwr9" event={"ID":"f064e922-a752-4acb-a992-31245b84d764","Type":"ContainerStarted","Data":"7d33e1bde09da142c500f45931a5989c299b10db0873df57c0c0c1175bfd33a4"} Apr 23 01:19:01.381301 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:01.381205 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" event={"ID":"5f8268a0-fe0a-4c34-a9de-47fd8cefce48","Type":"ContainerStarted","Data":"10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade"} Apr 23 01:19:01.382592 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:01.382560 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-7498df8756-rrwr9" event={"ID":"f064e922-a752-4acb-a992-31245b84d764","Type":"ContainerStarted","Data":"c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2"} Apr 23 01:19:01.396410 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:01.396356 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" podStartSLOduration=0.877911863 podStartE2EDuration="4.396338322s" podCreationTimestamp="2026-04-23 01:18:57 +0000 UTC" firstStartedPulling="2026-04-23 01:18:57.457171828 +0000 UTC m=+524.417025266" lastFinishedPulling="2026-04-23 01:19:00.975598287 +0000 UTC m=+527.935451725" observedRunningTime="2026-04-23 01:19:01.39416109 +0000 UTC m=+528.354014563" watchObservedRunningTime="2026-04-23 01:19:01.396338322 +0000 UTC m=+528.356191782" Apr 23 01:19:01.408925 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:01.408866 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-rrwr9" podStartSLOduration=1.029887377 podStartE2EDuration="4.408851865s" podCreationTimestamp="2026-04-23 01:18:57 +0000 UTC" firstStartedPulling="2026-04-23 01:18:57.583637296 +0000 UTC m=+524.543490734" lastFinishedPulling="2026-04-23 01:19:00.962601785 +0000 UTC m=+527.922455222" observedRunningTime="2026-04-23 01:19:01.408389599 +0000 UTC m=+528.368243060" watchObservedRunningTime="2026-04-23 01:19:01.408851865 +0000 UTC m=+528.368705324" Apr 23 01:19:01.432725 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:01.432689 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jpfjz"] Apr 23 01:19:03.389039 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:03.388911 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" podUID="5f8268a0-fe0a-4c34-a9de-47fd8cefce48" containerName="authorino" 
containerID="cri-o://10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade" gracePeriod=30 Apr 23 01:19:03.709369 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:03.709325 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" Apr 23 01:19:03.796222 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:03.796174 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvfgv\" (UniqueName: \"kubernetes.io/projected/5f8268a0-fe0a-4c34-a9de-47fd8cefce48-kube-api-access-qvfgv\") pod \"5f8268a0-fe0a-4c34-a9de-47fd8cefce48\" (UID: \"5f8268a0-fe0a-4c34-a9de-47fd8cefce48\") " Apr 23 01:19:03.798614 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:03.798579 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8268a0-fe0a-4c34-a9de-47fd8cefce48-kube-api-access-qvfgv" (OuterVolumeSpecName: "kube-api-access-qvfgv") pod "5f8268a0-fe0a-4c34-a9de-47fd8cefce48" (UID: "5f8268a0-fe0a-4c34-a9de-47fd8cefce48"). InnerVolumeSpecName "kube-api-access-qvfgv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:19:03.897364 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:03.897297 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvfgv\" (UniqueName: \"kubernetes.io/projected/5f8268a0-fe0a-4c34-a9de-47fd8cefce48-kube-api-access-qvfgv\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:19:04.393087 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.393047 2565 generic.go:358] "Generic (PLEG): container finished" podID="5f8268a0-fe0a-4c34-a9de-47fd8cefce48" containerID="10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade" exitCode=0 Apr 23 01:19:04.393499 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.393098 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" Apr 23 01:19:04.393499 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.393136 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" event={"ID":"5f8268a0-fe0a-4c34-a9de-47fd8cefce48","Type":"ContainerDied","Data":"10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade"} Apr 23 01:19:04.393499 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.393174 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jpfjz" event={"ID":"5f8268a0-fe0a-4c34-a9de-47fd8cefce48","Type":"ContainerDied","Data":"1942810f504c148ac6856c6ae16faf0264e4ede4a1f9bc958df77f79ecd728f3"} Apr 23 01:19:04.393499 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.393190 2565 scope.go:117] "RemoveContainer" containerID="10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade" Apr 23 01:19:04.401778 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.401762 2565 scope.go:117] "RemoveContainer" containerID="10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade" Apr 23 01:19:04.402079 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:19:04.402060 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade\": container with ID starting with 10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade not found: ID does not exist" containerID="10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade" Apr 23 01:19:04.402151 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.402100 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade"} err="failed to get container status \"10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade\": rpc error: code = NotFound 
desc = could not find container \"10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade\": container with ID starting with 10e5f720fff7b886cbbf77a60cbbd248390e7e17f49946218e558994eef96ade not found: ID does not exist" Apr 23 01:19:04.413753 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.413725 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jpfjz"] Apr 23 01:19:04.415730 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:04.415709 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jpfjz"] Apr 23 01:19:05.560413 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:05.560378 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8268a0-fe0a-4c34-a9de-47fd8cefce48" path="/var/lib/kubelet/pods/5f8268a0-fe0a-4c34-a9de-47fd8cefce48/volumes" Apr 23 01:19:25.766531 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.766498 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wtfn9"] Apr 23 01:19:25.766951 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.766766 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f8268a0-fe0a-4c34-a9de-47fd8cefce48" containerName="authorino" Apr 23 01:19:25.766951 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.766778 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8268a0-fe0a-4c34-a9de-47fd8cefce48" containerName="authorino" Apr 23 01:19:25.766951 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.766839 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f8268a0-fe0a-4c34-a9de-47fd8cefce48" containerName="authorino" Apr 23 01:19:25.770863 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.770846 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" Apr 23 01:19:25.777308 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.777284 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wtfn9"] Apr 23 01:19:25.869148 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.869125 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbsgj\" (UniqueName: \"kubernetes.io/projected/51e3e9db-723d-47cc-8318-ad9686f7497c-kube-api-access-wbsgj\") pod \"authorino-8b475cf9f-wtfn9\" (UID: \"51e3e9db-723d-47cc-8318-ad9686f7497c\") " pod="kuadrant-system/authorino-8b475cf9f-wtfn9" Apr 23 01:19:25.969655 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.969626 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbsgj\" (UniqueName: \"kubernetes.io/projected/51e3e9db-723d-47cc-8318-ad9686f7497c-kube-api-access-wbsgj\") pod \"authorino-8b475cf9f-wtfn9\" (UID: \"51e3e9db-723d-47cc-8318-ad9686f7497c\") " pod="kuadrant-system/authorino-8b475cf9f-wtfn9" Apr 23 01:19:25.981724 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.981704 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbsgj\" (UniqueName: \"kubernetes.io/projected/51e3e9db-723d-47cc-8318-ad9686f7497c-kube-api-access-wbsgj\") pod \"authorino-8b475cf9f-wtfn9\" (UID: \"51e3e9db-723d-47cc-8318-ad9686f7497c\") " pod="kuadrant-system/authorino-8b475cf9f-wtfn9" Apr 23 01:19:25.999917 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:25.999892 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wtfn9"] Apr 23 01:19:26.000090 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.000077 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" Apr 23 01:19:26.027691 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.027640 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5488cb79b5-gpjqd"] Apr 23 01:19:26.031837 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.031820 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5488cb79b5-gpjqd" Apr 23 01:19:26.038139 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.038104 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5488cb79b5-gpjqd"] Apr 23 01:19:26.098023 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.097997 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5488cb79b5-gpjqd"] Apr 23 01:19:26.098256 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:19:26.098228 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xcvlr], unattached volumes=[], failed to process volumes=[kube-api-access-xcvlr]: context canceled" pod="kuadrant-system/authorino-5488cb79b5-gpjqd" podUID="9d21b4cd-3f20-4e2f-af8d-81738f79ca00" Apr 23 01:19:26.126645 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.126611 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-57b76947b9-wxbzs"] Apr 23 01:19:26.129719 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:19:26.129693 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51e3e9db_723d_47cc_8318_ad9686f7497c.slice/crio-09f7031242d416dc6f0557e97f618a8347335978c2663ce0031b155b579ec81e WatchSource:0}: Error finding container 09f7031242d416dc6f0557e97f618a8347335978c2663ce0031b155b579ec81e: Status 404 returned error can't find the container with id 09f7031242d416dc6f0557e97f618a8347335978c2663ce0031b155b579ec81e Apr 23 01:19:26.130218 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:19:26.130198 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wtfn9"] Apr 23 01:19:26.130311 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.130287 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.132959 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.132938 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 23 01:19:26.134160 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.134140 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57b76947b9-wxbzs"] Apr 23 01:19:26.171969 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.171944 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvlr\" (UniqueName: \"kubernetes.io/projected/9d21b4cd-3f20-4e2f-af8d-81738f79ca00-kube-api-access-xcvlr\") pod \"authorino-5488cb79b5-gpjqd\" (UID: \"9d21b4cd-3f20-4e2f-af8d-81738f79ca00\") " pod="kuadrant-system/authorino-5488cb79b5-gpjqd" Apr 23 01:19:26.273228 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.273203 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a4fd551-ebab-444e-a446-29324301eb9b-tls-cert\") pod \"authorino-57b76947b9-wxbzs\" (UID: \"9a4fd551-ebab-444e-a446-29324301eb9b\") " pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.273329 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.273234 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqx4c\" (UniqueName: \"kubernetes.io/projected/9a4fd551-ebab-444e-a446-29324301eb9b-kube-api-access-xqx4c\") pod \"authorino-57b76947b9-wxbzs\" (UID: \"9a4fd551-ebab-444e-a446-29324301eb9b\") " 
pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.273369 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.273334 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvlr\" (UniqueName: \"kubernetes.io/projected/9d21b4cd-3f20-4e2f-af8d-81738f79ca00-kube-api-access-xcvlr\") pod \"authorino-5488cb79b5-gpjqd\" (UID: \"9d21b4cd-3f20-4e2f-af8d-81738f79ca00\") " pod="kuadrant-system/authorino-5488cb79b5-gpjqd" Apr 23 01:19:26.281283 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.281225 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvlr\" (UniqueName: \"kubernetes.io/projected/9d21b4cd-3f20-4e2f-af8d-81738f79ca00-kube-api-access-xcvlr\") pod \"authorino-5488cb79b5-gpjqd\" (UID: \"9d21b4cd-3f20-4e2f-af8d-81738f79ca00\") " pod="kuadrant-system/authorino-5488cb79b5-gpjqd" Apr 23 01:19:26.374018 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.373995 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a4fd551-ebab-444e-a446-29324301eb9b-tls-cert\") pod \"authorino-57b76947b9-wxbzs\" (UID: \"9a4fd551-ebab-444e-a446-29324301eb9b\") " pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.374117 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.374026 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqx4c\" (UniqueName: \"kubernetes.io/projected/9a4fd551-ebab-444e-a446-29324301eb9b-kube-api-access-xqx4c\") pod \"authorino-57b76947b9-wxbzs\" (UID: \"9a4fd551-ebab-444e-a446-29324301eb9b\") " pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.376296 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.376278 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a4fd551-ebab-444e-a446-29324301eb9b-tls-cert\") pod \"authorino-57b76947b9-wxbzs\" (UID: 
\"9a4fd551-ebab-444e-a446-29324301eb9b\") " pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.381320 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.381303 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqx4c\" (UniqueName: \"kubernetes.io/projected/9a4fd551-ebab-444e-a446-29324301eb9b-kube-api-access-xqx4c\") pod \"authorino-57b76947b9-wxbzs\" (UID: \"9a4fd551-ebab-444e-a446-29324301eb9b\") " pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.444547 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.444520 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:19:26.478937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.478914 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5488cb79b5-gpjqd" Apr 23 01:19:26.479076 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.478910 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" event={"ID":"51e3e9db-723d-47cc-8318-ad9686f7497c","Type":"ContainerStarted","Data":"09f7031242d416dc6f0557e97f618a8347335978c2663ce0031b155b579ec81e"} Apr 23 01:19:26.484441 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.484422 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5488cb79b5-gpjqd" Apr 23 01:19:26.566049 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.566025 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57b76947b9-wxbzs"] Apr 23 01:19:26.567854 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:19:26.567828 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a4fd551_ebab_444e_a446_29324301eb9b.slice/crio-92d328b74df34d1ffdf48e44f371a6af03d96a9083586b94188191600d1c1e93 WatchSource:0}: Error finding container 92d328b74df34d1ffdf48e44f371a6af03d96a9083586b94188191600d1c1e93: Status 404 returned error can't find the container with id 92d328b74df34d1ffdf48e44f371a6af03d96a9083586b94188191600d1c1e93 Apr 23 01:19:26.576485 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.576464 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcvlr\" (UniqueName: \"kubernetes.io/projected/9d21b4cd-3f20-4e2f-af8d-81738f79ca00-kube-api-access-xcvlr\") pod \"9d21b4cd-3f20-4e2f-af8d-81738f79ca00\" (UID: \"9d21b4cd-3f20-4e2f-af8d-81738f79ca00\") " Apr 23 01:19:26.578520 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.578494 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d21b4cd-3f20-4e2f-af8d-81738f79ca00-kube-api-access-xcvlr" (OuterVolumeSpecName: "kube-api-access-xcvlr") pod "9d21b4cd-3f20-4e2f-af8d-81738f79ca00" (UID: "9d21b4cd-3f20-4e2f-af8d-81738f79ca00"). InnerVolumeSpecName "kube-api-access-xcvlr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:19:26.677201 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:26.677171 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xcvlr\" (UniqueName: \"kubernetes.io/projected/9d21b4cd-3f20-4e2f-af8d-81738f79ca00-kube-api-access-xcvlr\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:19:27.483837 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.483804 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b76947b9-wxbzs" event={"ID":"9a4fd551-ebab-444e-a446-29324301eb9b","Type":"ContainerStarted","Data":"0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f"} Apr 23 01:19:27.484266 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.483844 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b76947b9-wxbzs" event={"ID":"9a4fd551-ebab-444e-a446-29324301eb9b","Type":"ContainerStarted","Data":"92d328b74df34d1ffdf48e44f371a6af03d96a9083586b94188191600d1c1e93"} Apr 23 01:19:27.485135 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.485114 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5488cb79b5-gpjqd" Apr 23 01:19:27.485228 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.485170 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" podUID="51e3e9db-723d-47cc-8318-ad9686f7497c" containerName="authorino" containerID="cri-o://33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e" gracePeriod=30 Apr 23 01:19:27.485228 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.485110 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" event={"ID":"51e3e9db-723d-47cc-8318-ad9686f7497c","Type":"ContainerStarted","Data":"33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e"} Apr 23 01:19:27.502963 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.502920 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-57b76947b9-wxbzs" podStartSLOduration=1.119267623 podStartE2EDuration="1.502907893s" podCreationTimestamp="2026-04-23 01:19:26 +0000 UTC" firstStartedPulling="2026-04-23 01:19:26.569630445 +0000 UTC m=+553.529483883" lastFinishedPulling="2026-04-23 01:19:26.953270705 +0000 UTC m=+553.913124153" observedRunningTime="2026-04-23 01:19:27.502215755 +0000 UTC m=+554.462069219" watchObservedRunningTime="2026-04-23 01:19:27.502907893 +0000 UTC m=+554.462761352" Apr 23 01:19:27.518086 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.518047 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" podStartSLOduration=2.159151063 podStartE2EDuration="2.518035323s" podCreationTimestamp="2026-04-23 01:19:25 +0000 UTC" firstStartedPulling="2026-04-23 01:19:26.131047014 +0000 UTC m=+553.090900452" lastFinishedPulling="2026-04-23 01:19:26.489931251 +0000 UTC m=+553.449784712" observedRunningTime="2026-04-23 01:19:27.515487723 +0000 UTC m=+554.475341188" 
watchObservedRunningTime="2026-04-23 01:19:27.518035323 +0000 UTC m=+554.477888783" Apr 23 01:19:27.532726 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.532703 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-rrwr9"] Apr 23 01:19:27.533166 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.533115 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-rrwr9" podUID="f064e922-a752-4acb-a992-31245b84d764" containerName="authorino" containerID="cri-o://c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2" gracePeriod=30 Apr 23 01:19:27.546683 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.546661 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5488cb79b5-gpjqd"] Apr 23 01:19:27.548830 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.548809 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5488cb79b5-gpjqd"] Apr 23 01:19:27.560233 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.560210 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d21b4cd-3f20-4e2f-af8d-81738f79ca00" path="/var/lib/kubelet/pods/9d21b4cd-3f20-4e2f-af8d-81738f79ca00/volumes" Apr 23 01:19:27.797611 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.797590 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" Apr 23 01:19:27.801512 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.801493 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rrwr9" Apr 23 01:19:27.893507 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.893471 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd8wz\" (UniqueName: \"kubernetes.io/projected/f064e922-a752-4acb-a992-31245b84d764-kube-api-access-zd8wz\") pod \"f064e922-a752-4acb-a992-31245b84d764\" (UID: \"f064e922-a752-4acb-a992-31245b84d764\") " Apr 23 01:19:27.893644 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.893567 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbsgj\" (UniqueName: \"kubernetes.io/projected/51e3e9db-723d-47cc-8318-ad9686f7497c-kube-api-access-wbsgj\") pod \"51e3e9db-723d-47cc-8318-ad9686f7497c\" (UID: \"51e3e9db-723d-47cc-8318-ad9686f7497c\") " Apr 23 01:19:27.895531 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.895507 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f064e922-a752-4acb-a992-31245b84d764-kube-api-access-zd8wz" (OuterVolumeSpecName: "kube-api-access-zd8wz") pod "f064e922-a752-4acb-a992-31245b84d764" (UID: "f064e922-a752-4acb-a992-31245b84d764"). InnerVolumeSpecName "kube-api-access-zd8wz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:19:27.895613 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.895528 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e3e9db-723d-47cc-8318-ad9686f7497c-kube-api-access-wbsgj" (OuterVolumeSpecName: "kube-api-access-wbsgj") pod "51e3e9db-723d-47cc-8318-ad9686f7497c" (UID: "51e3e9db-723d-47cc-8318-ad9686f7497c"). InnerVolumeSpecName "kube-api-access-wbsgj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:19:27.994205 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.994184 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbsgj\" (UniqueName: \"kubernetes.io/projected/51e3e9db-723d-47cc-8318-ad9686f7497c-kube-api-access-wbsgj\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:19:27.994288 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:27.994206 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zd8wz\" (UniqueName: \"kubernetes.io/projected/f064e922-a752-4acb-a992-31245b84d764-kube-api-access-zd8wz\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:19:28.489725 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.489690 2565 generic.go:358] "Generic (PLEG): container finished" podID="51e3e9db-723d-47cc-8318-ad9686f7497c" containerID="33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e" exitCode=0 Apr 23 01:19:28.490192 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.489737 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" Apr 23 01:19:28.490192 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.489773 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" event={"ID":"51e3e9db-723d-47cc-8318-ad9686f7497c","Type":"ContainerDied","Data":"33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e"} Apr 23 01:19:28.490192 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.489820 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wtfn9" event={"ID":"51e3e9db-723d-47cc-8318-ad9686f7497c","Type":"ContainerDied","Data":"09f7031242d416dc6f0557e97f618a8347335978c2663ce0031b155b579ec81e"} Apr 23 01:19:28.490192 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.489842 2565 scope.go:117] "RemoveContainer" containerID="33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e" Apr 23 01:19:28.490964 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.490943 2565 generic.go:358] "Generic (PLEG): container finished" podID="f064e922-a752-4acb-a992-31245b84d764" containerID="c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2" exitCode=0 Apr 23 01:19:28.491046 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.490992 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-rrwr9" event={"ID":"f064e922-a752-4acb-a992-31245b84d764","Type":"ContainerDied","Data":"c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2"} Apr 23 01:19:28.491046 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.491005 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-rrwr9" Apr 23 01:19:28.491046 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.491024 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-rrwr9" event={"ID":"f064e922-a752-4acb-a992-31245b84d764","Type":"ContainerDied","Data":"7d33e1bde09da142c500f45931a5989c299b10db0873df57c0c0c1175bfd33a4"} Apr 23 01:19:28.499548 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.499382 2565 scope.go:117] "RemoveContainer" containerID="33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e" Apr 23 01:19:28.499710 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:19:28.499685 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e\": container with ID starting with 33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e not found: ID does not exist" containerID="33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e" Apr 23 01:19:28.499787 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.499717 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e"} err="failed to get container status \"33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e\": rpc error: code = NotFound desc = could not find container \"33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e\": container with ID starting with 33161581883b8682b0511ae93f0152ebb1a57ef04d7099ec61f82e088818bd0e not found: ID does not exist" Apr 23 01:19:28.499787 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.499739 2565 scope.go:117] "RemoveContainer" containerID="c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2" Apr 23 01:19:28.509422 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.509401 2565 scope.go:117] 
"RemoveContainer" containerID="c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2" Apr 23 01:19:28.509841 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:19:28.509819 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2\": container with ID starting with c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2 not found: ID does not exist" containerID="c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2" Apr 23 01:19:28.509909 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.509852 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2"} err="failed to get container status \"c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2\": rpc error: code = NotFound desc = could not find container \"c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2\": container with ID starting with c7769698659d1cc414933e502049ecbcfd8cce5bc7bc64ed8c904e14b28a76d2 not found: ID does not exist" Apr 23 01:19:28.520905 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.520842 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wtfn9"] Apr 23 01:19:28.523673 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.523634 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wtfn9"] Apr 23 01:19:28.532848 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.532813 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-rrwr9"] Apr 23 01:19:28.539269 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:28.539238 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-rrwr9"] Apr 23 01:19:29.560288 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:19:29.560256 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e3e9db-723d-47cc-8318-ad9686f7497c" path="/var/lib/kubelet/pods/51e3e9db-723d-47cc-8318-ad9686f7497c/volumes" Apr 23 01:19:29.560659 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:19:29.560564 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f064e922-a752-4acb-a992-31245b84d764" path="/var/lib/kubelet/pods/f064e922-a752-4acb-a992-31245b84d764/volumes" Apr 23 01:20:30.660735 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.660704 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9"] Apr 23 01:20:30.661206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.660997 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f064e922-a752-4acb-a992-31245b84d764" containerName="authorino" Apr 23 01:20:30.661206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.661008 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f064e922-a752-4acb-a992-31245b84d764" containerName="authorino" Apr 23 01:20:30.661206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.661025 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51e3e9db-723d-47cc-8318-ad9686f7497c" containerName="authorino" Apr 23 01:20:30.661206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.661030 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e3e9db-723d-47cc-8318-ad9686f7497c" containerName="authorino" Apr 23 01:20:30.661206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.661078 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="51e3e9db-723d-47cc-8318-ad9686f7497c" containerName="authorino" Apr 23 01:20:30.661206 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.661087 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="f064e922-a752-4acb-a992-31245b84d764" containerName="authorino" Apr 23 01:20:30.664110 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.664091 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.666563 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.666541 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 23 01:20:30.668083 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.668059 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-2r2t2\"" Apr 23 01:20:30.668197 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.668097 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 23 01:20:30.668197 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.668064 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 23 01:20:30.671674 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.671640 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9"] Apr 23 01:20:30.804923 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.804894 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2ld6\" (UniqueName: \"kubernetes.io/projected/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-kube-api-access-m2ld6\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.804923 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.804935 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-home\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.805171 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.804967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.805171 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.805028 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.805171 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.805057 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.805171 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.805088 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906169 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906135 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2ld6\" (UniqueName: \"kubernetes.io/projected/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-kube-api-access-m2ld6\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906317 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906174 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906317 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906206 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906393 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906372 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906427 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906410 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906473 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906438 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906833 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906810 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906909 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906876 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.906909 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.906897 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.908634 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.908615 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.908800 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.908782 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.913272 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.913218 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2ld6\" (UniqueName: \"kubernetes.io/projected/04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a-kube-api-access-m2ld6\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9\" (UID: \"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:30.974870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:30.974844 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:31.096307 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:31.096284 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9"] Apr 23 01:20:31.098039 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:20:31.098013 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04e6ea4d_29d1_43ec_833b_51a1c0cf8a8a.slice/crio-9cf9750638c76f82a49dbe9be66dd2ebc12b1f93b708b321ae14da6928f24e24 WatchSource:0}: Error finding container 9cf9750638c76f82a49dbe9be66dd2ebc12b1f93b708b321ae14da6928f24e24: Status 404 returned error can't find the container with id 9cf9750638c76f82a49dbe9be66dd2ebc12b1f93b708b321ae14da6928f24e24 Apr 23 01:20:31.099597 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:31.099577 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 01:20:31.725937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:31.725905 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" event={"ID":"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a","Type":"ContainerStarted","Data":"9cf9750638c76f82a49dbe9be66dd2ebc12b1f93b708b321ae14da6928f24e24"} Apr 23 01:20:36.752242 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:36.752200 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" event={"ID":"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a","Type":"ContainerStarted","Data":"5f7607893a8fed3d9b393b1d23f6139ba749d7ee589d6f5906a39bc9fb8bf764"} Apr 23 01:20:41.770188 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:41.770159 2565 generic.go:358] "Generic (PLEG): container finished" podID="04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a" containerID="5f7607893a8fed3d9b393b1d23f6139ba749d7ee589d6f5906a39bc9fb8bf764" exitCode=0 Apr 
23 01:20:41.770518 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:41.770238 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" event={"ID":"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a","Type":"ContainerDied","Data":"5f7607893a8fed3d9b393b1d23f6139ba749d7ee589d6f5906a39bc9fb8bf764"} Apr 23 01:20:43.778797 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:43.778762 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" event={"ID":"04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a","Type":"ContainerStarted","Data":"0c585690a456b7f10f4a907331a4fd69cf44c2fd8d85ed2a805499d9b77ee161"} Apr 23 01:20:43.779218 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:43.778992 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:43.797314 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:43.797270 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" podStartSLOduration=2.013655789 podStartE2EDuration="13.797259556s" podCreationTimestamp="2026-04-23 01:20:30 +0000 UTC" firstStartedPulling="2026-04-23 01:20:31.099717896 +0000 UTC m=+618.059571333" lastFinishedPulling="2026-04-23 01:20:42.883321654 +0000 UTC m=+629.843175100" observedRunningTime="2026-04-23 01:20:43.795074117 +0000 UTC m=+630.754927589" watchObservedRunningTime="2026-04-23 01:20:43.797259556 +0000 UTC m=+630.757113015" Apr 23 01:20:50.868608 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:50.868578 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5"] Apr 23 01:20:50.875663 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:50.875644 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:50.878017 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:50.877996 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 23 01:20:50.879609 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:50.879588 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5"] Apr 23 01:20:51.063792 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.063763 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.063971 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.063821 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.063971 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.063873 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.063971 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.063897 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24affa34-978f-4872-90ee-a0e0e1716a75-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.063971 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.063925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.063971 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.063949 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4g69\" (UniqueName: \"kubernetes.io/projected/24affa34-978f-4872-90ee-a0e0e1716a75-kube-api-access-v4g69\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164287 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164266 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164399 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164307 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-dshm\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164399 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164325 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24affa34-978f-4872-90ee-a0e0e1716a75-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164527 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164505 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164569 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164548 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4g69\" (UniqueName: \"kubernetes.io/projected/24affa34-978f-4872-90ee-a0e0e1716a75-kube-api-access-v4g69\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164621 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164608 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164780 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164758 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164867 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164838 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.164918 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.164885 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.166485 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.166469 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24affa34-978f-4872-90ee-a0e0e1716a75-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.166890 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.166872 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24affa34-978f-4872-90ee-a0e0e1716a75-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.171804 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.171785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4g69\" (UniqueName: \"kubernetes.io/projected/24affa34-978f-4872-90ee-a0e0e1716a75-kube-api-access-v4g69\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tjpg5\" (UID: \"24affa34-978f-4872-90ee-a0e0e1716a75\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.186658 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.186639 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:51.303174 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.303142 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5"] Apr 23 01:20:51.305109 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:20:51.305082 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24affa34_978f_4872_90ee_a0e0e1716a75.slice/crio-cde4c813c881df73cf6455af7e1797e10d1f181eae80ff6e55530af02ae8ea2a WatchSource:0}: Error finding container cde4c813c881df73cf6455af7e1797e10d1f181eae80ff6e55530af02ae8ea2a: Status 404 returned error can't find the container with id cde4c813c881df73cf6455af7e1797e10d1f181eae80ff6e55530af02ae8ea2a Apr 23 01:20:51.807313 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.807272 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" event={"ID":"24affa34-978f-4872-90ee-a0e0e1716a75","Type":"ContainerStarted","Data":"6481cb9b1f38170059e07228409c81e196ca98608abc9fabc24909151e61cce3"} Apr 23 01:20:51.807313 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:51.807315 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" event={"ID":"24affa34-978f-4872-90ee-a0e0e1716a75","Type":"ContainerStarted","Data":"cde4c813c881df73cf6455af7e1797e10d1f181eae80ff6e55530af02ae8ea2a"} Apr 23 01:20:54.797148 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:54.797108 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9" Apr 23 01:20:56.826252 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:56.826177 2565 generic.go:358] "Generic (PLEG): container finished" podID="24affa34-978f-4872-90ee-a0e0e1716a75" containerID="6481cb9b1f38170059e07228409c81e196ca98608abc9fabc24909151e61cce3" exitCode=0 Apr 23 01:20:56.826662 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:56.826257 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" event={"ID":"24affa34-978f-4872-90ee-a0e0e1716a75","Type":"ContainerDied","Data":"6481cb9b1f38170059e07228409c81e196ca98608abc9fabc24909151e61cce3"} Apr 23 01:20:57.831137 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:57.831094 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" event={"ID":"24affa34-978f-4872-90ee-a0e0e1716a75","Type":"ContainerStarted","Data":"398142947f857b8e803b3f8a66dc11e41752afa96c950c7a4ed9387f26665089"} Apr 23 01:20:57.831618 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:57.831336 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:20:57.850454 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:20:57.850407 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" podStartSLOduration=7.636023726 podStartE2EDuration="7.850392783s" podCreationTimestamp="2026-04-23 01:20:50 +0000 UTC" firstStartedPulling="2026-04-23 01:20:56.826931032 +0000 UTC 
m=+643.786784471" lastFinishedPulling="2026-04-23 01:20:57.041300089 +0000 UTC m=+644.001153528" observedRunningTime="2026-04-23 01:20:57.847327972 +0000 UTC m=+644.807181432" watchObservedRunningTime="2026-04-23 01:20:57.850392783 +0000 UTC m=+644.810246242" Apr 23 01:21:05.871137 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.871105 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8"] Apr 23 01:21:05.874231 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.874210 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:05.876629 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.876607 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 23 01:21:05.888275 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.888249 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8"] Apr 23 01:21:05.977151 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.977125 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:05.977255 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.977157 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 
01:21:05.977255 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.977178 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:05.977255 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.977214 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:05.977355 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.977283 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jd2v\" (UniqueName: \"kubernetes.io/projected/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-kube-api-access-2jd2v\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:05.977355 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:05.977315 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078085 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jd2v\" (UniqueName: 
\"kubernetes.io/projected/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-kube-api-access-2jd2v\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078198 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078098 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078198 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078121 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078198 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078156 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078198 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078193 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078399 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078218 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078638 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078613 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078702 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078637 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.078748 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.078693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.080326 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.080305 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: 
\"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.080486 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.080468 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.085493 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.085472 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jd2v\" (UniqueName: \"kubernetes.io/projected/bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8-kube-api-access-2jd2v\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8\" (UID: \"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.190074 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.190054 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:06.305765 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.305741 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8"] Apr 23 01:21:06.306440 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:21:06.306412 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf749cf7_d1d0_4b39_b37b_63e43c3a6ec8.slice/crio-893c297662f22cae7328d00355451d7a1c5429bbf5665ef5dd49252dbd522dc3 WatchSource:0}: Error finding container 893c297662f22cae7328d00355451d7a1c5429bbf5665ef5dd49252dbd522dc3: Status 404 returned error can't find the container with id 893c297662f22cae7328d00355451d7a1c5429bbf5665ef5dd49252dbd522dc3 Apr 23 01:21:06.864534 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.864484 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" event={"ID":"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8","Type":"ContainerStarted","Data":"72e2e641d5aff5fed853fb585b7b6ac6851e7daaae37437ea6744c3f1c714c71"} Apr 23 01:21:06.864534 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:06.864534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" event={"ID":"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8","Type":"ContainerStarted","Data":"893c297662f22cae7328d00355451d7a1c5429bbf5665ef5dd49252dbd522dc3"} Apr 23 01:21:08.852851 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:08.852823 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tjpg5" Apr 23 01:21:11.884011 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:11.883934 2565 generic.go:358] "Generic (PLEG): container finished" podID="bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8" containerID="72e2e641d5aff5fed853fb585b7b6ac6851e7daaae37437ea6744c3f1c714c71" 
exitCode=0 Apr 23 01:21:11.884293 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:11.884013 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" event={"ID":"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8","Type":"ContainerDied","Data":"72e2e641d5aff5fed853fb585b7b6ac6851e7daaae37437ea6744c3f1c714c71"} Apr 23 01:21:12.890414 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:12.890373 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" event={"ID":"bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8","Type":"ContainerStarted","Data":"5432f4220fbf0c87ea65eac3058ec8ff8533c41567cb15506efcf21f649bb777"} Apr 23 01:21:12.891437 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:12.891412 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:12.909531 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:12.909491 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" podStartSLOduration=7.717031424 podStartE2EDuration="7.909478294s" podCreationTimestamp="2026-04-23 01:21:05 +0000 UTC" firstStartedPulling="2026-04-23 01:21:11.884572071 +0000 UTC m=+658.844425509" lastFinishedPulling="2026-04-23 01:21:12.077018938 +0000 UTC m=+659.036872379" observedRunningTime="2026-04-23 01:21:12.906792393 +0000 UTC m=+659.866645853" watchObservedRunningTime="2026-04-23 01:21:12.909478294 +0000 UTC m=+659.869331753" Apr 23 01:21:23.906302 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:23.906268 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8" Apr 23 01:21:47.818961 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:47.818928 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6fdf5b9964-b2dg6"] Apr 23 01:21:47.822122 ip-10-0-135-74 
kubenswrapper[2565]: I0423 01:21:47.822106 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:47.829551 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:47.829524 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6fdf5b9964-b2dg6"] Apr 23 01:21:47.968401 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:47.968374 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6hts\" (UniqueName: \"kubernetes.io/projected/f57d64d6-9628-4b18-b41f-fcbe9632b765-kube-api-access-j6hts\") pod \"authorino-6fdf5b9964-b2dg6\" (UID: \"f57d64d6-9628-4b18-b41f-fcbe9632b765\") " pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:47.968524 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:47.968406 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f57d64d6-9628-4b18-b41f-fcbe9632b765-tls-cert\") pod \"authorino-6fdf5b9964-b2dg6\" (UID: \"f57d64d6-9628-4b18-b41f-fcbe9632b765\") " pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:48.069040 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:48.068958 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f57d64d6-9628-4b18-b41f-fcbe9632b765-tls-cert\") pod \"authorino-6fdf5b9964-b2dg6\" (UID: \"f57d64d6-9628-4b18-b41f-fcbe9632b765\") " pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:48.069146 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:48.069052 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6hts\" (UniqueName: \"kubernetes.io/projected/f57d64d6-9628-4b18-b41f-fcbe9632b765-kube-api-access-j6hts\") pod \"authorino-6fdf5b9964-b2dg6\" (UID: \"f57d64d6-9628-4b18-b41f-fcbe9632b765\") " 
pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:48.071226 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:48.071205 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f57d64d6-9628-4b18-b41f-fcbe9632b765-tls-cert\") pod \"authorino-6fdf5b9964-b2dg6\" (UID: \"f57d64d6-9628-4b18-b41f-fcbe9632b765\") " pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:48.076495 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:48.076475 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6hts\" (UniqueName: \"kubernetes.io/projected/f57d64d6-9628-4b18-b41f-fcbe9632b765-kube-api-access-j6hts\") pod \"authorino-6fdf5b9964-b2dg6\" (UID: \"f57d64d6-9628-4b18-b41f-fcbe9632b765\") " pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:48.132141 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:48.132120 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" Apr 23 01:21:48.253595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:48.253566 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6fdf5b9964-b2dg6"] Apr 23 01:21:48.254872 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:21:48.254846 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf57d64d6_9628_4b18_b41f_fcbe9632b765.slice/crio-5e14e5cb4ca77d0533153a039e34a4ee240c217659d05bcd1c4b7deeba5ddbb0 WatchSource:0}: Error finding container 5e14e5cb4ca77d0533153a039e34a4ee240c217659d05bcd1c4b7deeba5ddbb0: Status 404 returned error can't find the container with id 5e14e5cb4ca77d0533153a039e34a4ee240c217659d05bcd1c4b7deeba5ddbb0 Apr 23 01:21:49.017443 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.017363 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" 
event={"ID":"f57d64d6-9628-4b18-b41f-fcbe9632b765","Type":"ContainerStarted","Data":"345bc1adc6b17aaa43df53f6b23fc8fab36fe2c2328e823d894d546aa5a173e1"} Apr 23 01:21:49.017443 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.017404 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" event={"ID":"f57d64d6-9628-4b18-b41f-fcbe9632b765","Type":"ContainerStarted","Data":"5e14e5cb4ca77d0533153a039e34a4ee240c217659d05bcd1c4b7deeba5ddbb0"} Apr 23 01:21:49.033433 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.033383 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6fdf5b9964-b2dg6" podStartSLOduration=1.5820286810000002 podStartE2EDuration="2.033367794s" podCreationTimestamp="2026-04-23 01:21:47 +0000 UTC" firstStartedPulling="2026-04-23 01:21:48.25618413 +0000 UTC m=+695.216037567" lastFinishedPulling="2026-04-23 01:21:48.707523239 +0000 UTC m=+695.667376680" observedRunningTime="2026-04-23 01:21:49.032606656 +0000 UTC m=+695.992460116" watchObservedRunningTime="2026-04-23 01:21:49.033367794 +0000 UTC m=+695.993221252" Apr 23 01:21:49.057640 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.057604 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57b76947b9-wxbzs"] Apr 23 01:21:49.057887 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.057856 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-57b76947b9-wxbzs" podUID="9a4fd551-ebab-444e-a446-29324301eb9b" containerName="authorino" containerID="cri-o://0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f" gracePeriod=30 Apr 23 01:21:49.297260 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.297233 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:21:49.382055 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.382028 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a4fd551-ebab-444e-a446-29324301eb9b-tls-cert\") pod \"9a4fd551-ebab-444e-a446-29324301eb9b\" (UID: \"9a4fd551-ebab-444e-a446-29324301eb9b\") " Apr 23 01:21:49.382197 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.382063 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqx4c\" (UniqueName: \"kubernetes.io/projected/9a4fd551-ebab-444e-a446-29324301eb9b-kube-api-access-xqx4c\") pod \"9a4fd551-ebab-444e-a446-29324301eb9b\" (UID: \"9a4fd551-ebab-444e-a446-29324301eb9b\") " Apr 23 01:21:49.383942 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.383914 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a4fd551-ebab-444e-a446-29324301eb9b-kube-api-access-xqx4c" (OuterVolumeSpecName: "kube-api-access-xqx4c") pod "9a4fd551-ebab-444e-a446-29324301eb9b" (UID: "9a4fd551-ebab-444e-a446-29324301eb9b"). InnerVolumeSpecName "kube-api-access-xqx4c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:21:49.390503 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.390484 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4fd551-ebab-444e-a446-29324301eb9b-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "9a4fd551-ebab-444e-a446-29324301eb9b" (UID: "9a4fd551-ebab-444e-a446-29324301eb9b"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:21:49.483200 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.483174 2565 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a4fd551-ebab-444e-a446-29324301eb9b-tls-cert\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:21:49.483290 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:49.483200 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqx4c\" (UniqueName: \"kubernetes.io/projected/9a4fd551-ebab-444e-a446-29324301eb9b-kube-api-access-xqx4c\") on node \"ip-10-0-135-74.ec2.internal\" DevicePath \"\"" Apr 23 01:21:50.024538 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.024508 2565 generic.go:358] "Generic (PLEG): container finished" podID="9a4fd551-ebab-444e-a446-29324301eb9b" containerID="0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f" exitCode=0 Apr 23 01:21:50.024883 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.024558 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-57b76947b9-wxbzs" Apr 23 01:21:50.024883 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.024591 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b76947b9-wxbzs" event={"ID":"9a4fd551-ebab-444e-a446-29324301eb9b","Type":"ContainerDied","Data":"0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f"} Apr 23 01:21:50.024883 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.024629 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b76947b9-wxbzs" event={"ID":"9a4fd551-ebab-444e-a446-29324301eb9b","Type":"ContainerDied","Data":"92d328b74df34d1ffdf48e44f371a6af03d96a9083586b94188191600d1c1e93"} Apr 23 01:21:50.024883 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.024647 2565 scope.go:117] "RemoveContainer" containerID="0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f" Apr 23 01:21:50.032509 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.032498 2565 scope.go:117] "RemoveContainer" containerID="0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f" Apr 23 01:21:50.032722 ip-10-0-135-74 kubenswrapper[2565]: E0423 01:21:50.032703 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f\": container with ID starting with 0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f not found: ID does not exist" containerID="0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f" Apr 23 01:21:50.032771 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.032727 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f"} err="failed to get container status \"0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f\": rpc error: code = 
NotFound desc = could not find container \"0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f\": container with ID starting with 0d7ff7378283fd4b323c35baeb7158b4ef16ddde24bf65f262e4b41ff02bf51f not found: ID does not exist" Apr 23 01:21:50.041115 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.041093 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57b76947b9-wxbzs"] Apr 23 01:21:50.044995 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:50.044961 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-57b76947b9-wxbzs"] Apr 23 01:21:51.559719 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:21:51.559684 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a4fd551-ebab-444e-a446-29324301eb9b" path="/var/lib/kubelet/pods/9a4fd551-ebab-444e-a446-29324301eb9b/volumes" Apr 23 01:44:04.931775 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:04.931740 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6fdf5b9964-b2dg6_f57d64d6-9628-4b18-b41f-fcbe9632b765/authorino/0.log" Apr 23 01:44:09.657598 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:09.657566 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5fb5768b86-cxw88_d9c99aae-b547-4c5d-b3cc-648cbdd109b2/manager/0.log" Apr 23 01:44:10.751543 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.751508 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d_c4534e35-eb66-433b-9d99-6c8855233dfa/util/0.log" Apr 23 01:44:10.758227 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.758199 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d_c4534e35-eb66-433b-9d99-6c8855233dfa/pull/0.log" Apr 23 01:44:10.763315 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:44:10.763298 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d_c4534e35-eb66-433b-9d99-6c8855233dfa/extract/0.log" Apr 23 01:44:10.867314 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.867272 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d_11c5f374-0168-48de-b916-7038711c16c8/util/0.log" Apr 23 01:44:10.873016 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.872968 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d_11c5f374-0168-48de-b916-7038711c16c8/pull/0.log" Apr 23 01:44:10.879159 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.879128 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d_11c5f374-0168-48de-b916-7038711c16c8/extract/0.log" Apr 23 01:44:10.983668 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.983637 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6_d1cd9850-356a-4f75-83b7-40ec06bf50bd/extract/0.log" Apr 23 01:44:10.989116 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.989084 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6_d1cd9850-356a-4f75-83b7-40ec06bf50bd/util/0.log" Apr 23 01:44:10.994639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:10.994611 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6_d1cd9850-356a-4f75-83b7-40ec06bf50bd/pull/0.log" Apr 23 01:44:11.098529 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:11.098454 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l_1e52617f-6882-4d96-8047-3acbbcef7a99/extract/0.log" Apr 23 01:44:11.103871 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:11.103851 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l_1e52617f-6882-4d96-8047-3acbbcef7a99/util/0.log" Apr 23 01:44:11.109311 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:11.109290 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l_1e52617f-6882-4d96-8047-3acbbcef7a99/pull/0.log" Apr 23 01:44:11.222736 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:11.222701 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6fdf5b9964-b2dg6_f57d64d6-9628-4b18-b41f-fcbe9632b765/authorino/0.log" Apr 23 01:44:11.545373 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:11.545340 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-8sswq_02a14d76-8ef3-486a-b525-4477611711e7/kuadrant-console-plugin/0.log" Apr 23 01:44:12.554896 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:12.554862 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-8596599875-gnq79_693a7cef-37fb-4989-a8cc-6ae494d989a1/kube-auth-proxy/0.log" Apr 23 01:44:12.888419 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:12.888325 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78df469446-xzwmg_cf49adfe-57aa-40fc-9a7b-4156e87b2ad7/router/0.log" Apr 23 01:44:13.319624 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:13.319543 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8_bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8/storage-initializer/0.log" Apr 23 01:44:13.325673 
ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:13.325652 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-vkvt8_bf749cf7-d1d0-4b39-b37b-63e43c3a6ec8/main/0.log" Apr 23 01:44:13.436486 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:13.436425 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-tjpg5_24affa34-978f-4872-90ee-a0e0e1716a75/storage-initializer/0.log" Apr 23 01:44:13.442870 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:13.442845 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-tjpg5_24affa34-978f-4872-90ee-a0e0e1716a75/main/0.log" Apr 23 01:44:13.660373 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:13.660345 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9_04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a/storage-initializer/0.log" Apr 23 01:44:13.667041 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:13.667017 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-rx9j9_04e6ea4d-29d1-43ec-833b-51a1c0cf8a8a/main/0.log" Apr 23 01:44:20.530040 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:20.530001 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qvzx4_c5b97054-a7d6-400f-9526-e28785d300ef/global-pull-secret-syncer/0.log" Apr 23 01:44:20.652991 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:20.652952 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xg5xx_37eabc47-849b-40f9-bac1-f73b5c3d4329/konnectivity-agent/0.log" Apr 23 01:44:20.673813 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:20.673784 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-74.ec2.internal_62aaf4ba8ebf9ff33dbeeea8017ddd63/haproxy/0.log" 
Apr 23 01:44:24.335439 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.335411 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d_c4534e35-eb66-433b-9d99-6c8855233dfa/extract/0.log" Apr 23 01:44:24.362513 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.362477 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d_c4534e35-eb66-433b-9d99-6c8855233dfa/util/0.log" Apr 23 01:44:24.391746 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.391713 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759nsn8d_c4534e35-eb66-433b-9d99-6c8855233dfa/pull/0.log" Apr 23 01:44:24.426907 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.426876 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d_11c5f374-0168-48de-b916-7038711c16c8/extract/0.log" Apr 23 01:44:24.450605 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.450568 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d_11c5f374-0168-48de-b916-7038711c16c8/util/0.log" Apr 23 01:44:24.482052 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.482019 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0d8p8d_11c5f374-0168-48de-b916-7038711c16c8/pull/0.log" Apr 23 01:44:24.523345 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.523317 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6_d1cd9850-356a-4f75-83b7-40ec06bf50bd/extract/0.log" Apr 23 01:44:24.548080 ip-10-0-135-74 kubenswrapper[2565]: I0423 
01:44:24.548049 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6_d1cd9850-356a-4f75-83b7-40ec06bf50bd/util/0.log"
Apr 23 01:44:24.575397 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.575366 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73w2pl6_d1cd9850-356a-4f75-83b7-40ec06bf50bd/pull/0.log"
Apr 23 01:44:24.606367 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.606281 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l_1e52617f-6882-4d96-8047-3acbbcef7a99/extract/0.log"
Apr 23 01:44:24.630438 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.630403 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l_1e52617f-6882-4d96-8047-3acbbcef7a99/util/0.log"
Apr 23 01:44:24.654593 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.654568 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1gnf9l_1e52617f-6882-4d96-8047-3acbbcef7a99/pull/0.log"
Apr 23 01:44:24.831885 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.831852 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6fdf5b9964-b2dg6_f57d64d6-9628-4b18-b41f-fcbe9632b765/authorino/0.log"
Apr 23 01:44:24.939986 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:24.939957 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-8sswq_02a14d76-8ef3-486a-b525-4477611711e7/kuadrant-console-plugin/0.log"
Apr 23 01:44:26.518413 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:26.518380 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3259f911-f547-46dd-8b53-171fdc52f63a/alertmanager/0.log"
Apr 23 01:44:26.545719 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:26.545687 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3259f911-f547-46dd-8b53-171fdc52f63a/config-reloader/0.log"
Apr 23 01:44:26.566737 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:26.566709 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3259f911-f547-46dd-8b53-171fdc52f63a/kube-rbac-proxy-web/0.log"
Apr 23 01:44:26.599631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:26.599607 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3259f911-f547-46dd-8b53-171fdc52f63a/kube-rbac-proxy/0.log"
Apr 23 01:44:26.633730 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:26.633702 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3259f911-f547-46dd-8b53-171fdc52f63a/kube-rbac-proxy-metric/0.log"
Apr 23 01:44:26.657352 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:26.657318 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3259f911-f547-46dd-8b53-171fdc52f63a/prom-label-proxy/0.log"
Apr 23 01:44:26.683789 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:26.683764 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3259f911-f547-46dd-8b53-171fdc52f63a/init-config-reloader/0.log"
Apr 23 01:44:27.066479 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:27.066452 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-svn7q_66e0905b-e676-4e53-a97a-bfbf17b1c22d/node-exporter/0.log"
Apr 23 01:44:27.087027 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:27.086997 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-svn7q_66e0905b-e676-4e53-a97a-bfbf17b1c22d/kube-rbac-proxy/0.log"
Apr 23 01:44:27.109095 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:27.109067 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-svn7q_66e0905b-e676-4e53-a97a-bfbf17b1c22d/init-textfile/0.log"
Apr 23 01:44:29.360779 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.360737 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"]
Apr 23 01:44:29.361288 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.361116 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a4fd551-ebab-444e-a446-29324301eb9b" containerName="authorino"
Apr 23 01:44:29.361288 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.361129 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4fd551-ebab-444e-a446-29324301eb9b" containerName="authorino"
Apr 23 01:44:29.361288 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.361194 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a4fd551-ebab-444e-a446-29324301eb9b" containerName="authorino"
Apr 23 01:44:29.364146 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.364127 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.366885 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.366856 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6hxp5\"/\"openshift-service-ca.crt\""
Apr 23 01:44:29.367027 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.367001 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6hxp5\"/\"kube-root-ca.crt\""
Apr 23 01:44:29.368595 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.368573 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6hxp5\"/\"default-dockercfg-rb8bq\""
Apr 23 01:44:29.371738 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.371347 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"]
Apr 23 01:44:29.506631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.506590 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-proc\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.506631 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.506637 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-sys\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.506851 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.506720 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbpl\" (UniqueName: \"kubernetes.io/projected/efdf8f05-fdbe-4f18-90ea-871c5158cc59-kube-api-access-ddbpl\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.506851 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.506749 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-lib-modules\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.506851 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.506769 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-podres\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608198 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608155 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-proc\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608366 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-sys\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608366 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608259 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbpl\" (UniqueName: \"kubernetes.io/projected/efdf8f05-fdbe-4f18-90ea-871c5158cc59-kube-api-access-ddbpl\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608366 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608276 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-lib-modules\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608366 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608282 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-proc\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608366 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608296 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-podres\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608366 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608341 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-sys\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608559 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608421 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-podres\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.608559 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.608420 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/efdf8f05-fdbe-4f18-90ea-871c5158cc59-lib-modules\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.616249 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.616175 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbpl\" (UniqueName: \"kubernetes.io/projected/efdf8f05-fdbe-4f18-90ea-871c5158cc59-kube-api-access-ddbpl\") pod \"perf-node-gather-daemonset-8ktkl\" (UID: \"efdf8f05-fdbe-4f18-90ea-871c5158cc59\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.674824 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.674781 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:29.801046 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.801011 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"]
Apr 23 01:44:29.802390 ip-10-0-135-74 kubenswrapper[2565]: W0423 01:44:29.802358 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podefdf8f05_fdbe_4f18_90ea_871c5158cc59.slice/crio-33119c345756e89bfe985ead21943ab8a65788fd5e8306709bb8758c546e1ed3 WatchSource:0}: Error finding container 33119c345756e89bfe985ead21943ab8a65788fd5e8306709bb8758c546e1ed3: Status 404 returned error can't find the container with id 33119c345756e89bfe985ead21943ab8a65788fd5e8306709bb8758c546e1ed3
Apr 23 01:44:29.804135 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:29.804115 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:44:30.648830 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:30.648782 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl" event={"ID":"efdf8f05-fdbe-4f18-90ea-871c5158cc59","Type":"ContainerStarted","Data":"b208298eee62a44151bedc22334dd9afd5e1ab8416f149a081edc0ceec8619cd"}
Apr 23 01:44:30.648830 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:30.648821 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl" event={"ID":"efdf8f05-fdbe-4f18-90ea-871c5158cc59","Type":"ContainerStarted","Data":"33119c345756e89bfe985ead21943ab8a65788fd5e8306709bb8758c546e1ed3"}
Apr 23 01:44:30.649284 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:30.648918 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:30.669314 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:30.669261 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl" podStartSLOduration=1.6692457090000001 podStartE2EDuration="1.669245709s" podCreationTimestamp="2026-04-23 01:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:44:30.666611932 +0000 UTC m=+2057.626465393" watchObservedRunningTime="2026-04-23 01:44:30.669245709 +0000 UTC m=+2057.629099168"
Apr 23 01:44:31.408148 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:31.408091 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-czwb8_2efc1102-e677-4cde-b6a8-5304536665ad/dns/0.log"
Apr 23 01:44:31.435759 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:31.435726 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-czwb8_2efc1102-e677-4cde-b6a8-5304536665ad/kube-rbac-proxy/0.log"
Apr 23 01:44:31.599718 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:31.599689 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-twhzp_100ffad2-0adc-4293-8bc1-c64fdc753f08/dns-node-resolver/0.log"
Apr 23 01:44:32.061709 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:32.061677 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-86f775b4f-hdvw2_73635124-a810-416a-8482-00ba10f2ad6e/registry/0.log"
Apr 23 01:44:32.069144 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:32.069121 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-86f775b4f-hdvw2_73635124-a810-416a-8482-00ba10f2ad6e/registry/1.log"
Apr 23 01:44:32.140937 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:32.140899 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rbjs6_318c5767-f4ad-4937-bccb-ef0c86ed7ff7/node-ca/0.log"
Apr 23 01:44:33.144592 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:33.144561 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-8596599875-gnq79_693a7cef-37fb-4989-a8cc-6ae494d989a1/kube-auth-proxy/0.log"
Apr 23 01:44:33.286490 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:33.286445 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78df469446-xzwmg_cf49adfe-57aa-40fc-9a7b-4156e87b2ad7/router/0.log"
Apr 23 01:44:33.821847 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:33.821816 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p8psz_9342de38-1c25-496b-b531-420cff35d1e6/serve-healthcheck-canary/0.log"
Apr 23 01:44:34.290538 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:34.290500 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6q2sj_dc212b92-93e6-442a-ba12-b470f57b4965/kube-rbac-proxy/0.log"
Apr 23 01:44:34.309956 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:34.309925 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6q2sj_dc212b92-93e6-442a-ba12-b470f57b4965/exporter/0.log"
Apr 23 01:44:34.329777 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:34.329733 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6q2sj_dc212b92-93e6-442a-ba12-b470f57b4965/extractor/0.log"
Apr 23 01:44:36.463639 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:36.463592 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5fb5768b86-cxw88_d9c99aae-b547-4c5d-b3cc-648cbdd109b2/manager/0.log"
Apr 23 01:44:36.663007 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:36.662963 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-8ktkl"
Apr 23 01:44:37.742406 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:37.742375 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6b799cbd77-jtwlz_a5b48147-74ff-45de-b9df-251ff995fea3/manager/0.log"
Apr 23 01:44:43.522066 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:43.522031 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4vqs_54d9175d-7498-4b1f-8e42-2c7b5a37d2f4/kube-multus-additional-cni-plugins/0.log"
Apr 23 01:44:43.543823 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:43.543787 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4vqs_54d9175d-7498-4b1f-8e42-2c7b5a37d2f4/egress-router-binary-copy/0.log"
Apr 23 01:44:43.564101 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:43.564066 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4vqs_54d9175d-7498-4b1f-8e42-2c7b5a37d2f4/cni-plugins/0.log"
Apr 23 01:44:43.585735 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:43.585709 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4vqs_54d9175d-7498-4b1f-8e42-2c7b5a37d2f4/bond-cni-plugin/0.log"
Apr 23 01:44:43.605375 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:43.605344 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4vqs_54d9175d-7498-4b1f-8e42-2c7b5a37d2f4/routeoverride-cni/0.log"
Apr 23 01:44:43.626363 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:43.626337 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4vqs_54d9175d-7498-4b1f-8e42-2c7b5a37d2f4/whereabouts-cni-bincopy/0.log"
Apr 23 01:44:43.648749 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:43.648715 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4vqs_54d9175d-7498-4b1f-8e42-2c7b5a37d2f4/whereabouts-cni/0.log"
Apr 23 01:44:44.033580 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:44.033552 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rgvmw_09a44761-f9fa-463b-86c4-dacaca4d17a8/kube-multus/0.log"
Apr 23 01:44:44.133936 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:44.133879 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ps42z_14afdf01-fa2e-4563-8fbf-0cc2613b39ba/network-metrics-daemon/0.log"
Apr 23 01:44:44.152095 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:44.152069 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ps42z_14afdf01-fa2e-4563-8fbf-0cc2613b39ba/kube-rbac-proxy/0.log"
Apr 23 01:44:45.529715 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.529678 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/ovn-controller/0.log"
Apr 23 01:44:45.554878 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.554842 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/ovn-acl-logging/0.log"
Apr 23 01:44:45.571686 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.571657 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/kube-rbac-proxy-node/0.log"
Apr 23 01:44:45.592905 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.592874 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 01:44:45.612412 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.612301 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/northd/0.log"
Apr 23 01:44:45.634842 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.634814 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/nbdb/0.log"
Apr 23 01:44:45.663885 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.663858 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/sbdb/0.log"
Apr 23 01:44:45.757128 ip-10-0-135-74 kubenswrapper[2565]: I0423 01:44:45.757089 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xt5wx_6a5144da-faff-4d89-b9c9-baf899b2a716/ovnkube-controller/0.log"