Apr 21 04:38:57.266686 ip-10-0-135-122 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 04:38:57.649427 ip-10-0-135-122 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:38:57.649427 ip-10-0-135-122 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 04:38:57.649427 ip-10-0-135-122 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:38:57.649427 ip-10-0-135-122 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 04:38:57.649427 ip-10-0-135-122 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
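The deprecation warnings above all point at the same remedy: move these settings out of kubelet command-line flags and into the KubeletConfiguration file passed via --config (here /etc/kubernetes/kubelet.conf, per the --config flag logged below). A minimal sketch, using upstream KubeletConfiguration field names; the sizes and thresholds shown are illustrative assumptions, not values taken from this node:

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment replacing the deprecated flags
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (illustrative sizes)
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration, per the warning's suggestion
# to use eviction thresholds instead (illustrative threshold)
evictionHard:
  memory.available: 100Mi
```

On a managed OpenShift node this file is rendered by the machine-config operator, so the flags-to-config migration would normally happen there rather than by hand-editing the node.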
Apr 21 04:38:57.651089 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.650997 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 04:38:57.656609 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656585 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:38:57.656609 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656605 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:57.656609 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656609 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:57.656609 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656612 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:57.656609 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656616 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:57.656609 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656619 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656622 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656625 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656627 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656630 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656634 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656638 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656641 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656644 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656646 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656649 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656651 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656654 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656657 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656660 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656662 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656665 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656668 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656670 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:57.656836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656673 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656676 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656678 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656681 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656687 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656689 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656693 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656696 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656698 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656701 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656704 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656706 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656709 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656712 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656715 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656718 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656720 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656723 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656726 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656728 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:57.657284 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656731 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656733 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656736 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656739 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656742 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656745 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656747 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656750 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656752 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656755 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656757 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656760 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656764 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656768 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656771 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656774 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656777 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656780 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656783 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656787 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:57.657814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656790 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656793 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656796 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656800 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656802 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656805 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656808 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656811 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656814 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656817 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656819 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656823 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656826 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656829 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656832 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656835 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656837 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656840 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656843 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:57.658390 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656845 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656848 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.656851 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657243 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657248 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657252 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657255 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657258 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657261 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657264 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657267 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657272 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657275 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657278 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657281 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657284 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657287 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657289 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657292 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:57.658850 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657295 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657298 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657300 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657303 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657306 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657308 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657310 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657314 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657318 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657321 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657324 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657327 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657330 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657332 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657335 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657337 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657339 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657342 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657344 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657347 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:57.659342 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657350 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657353 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657355 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657358 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657376 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657379 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657382 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657384 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657387 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657390 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657392 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657395 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657397 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657400 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657402 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657405 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657407 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657410 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657412 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657415 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:57.659866 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657418 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657428 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657431 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657434 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657436 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657439 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657441 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657443 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657446 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657448 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657451 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657453 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657456 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657459 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657462 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657464 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657467 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657470 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657473 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:57.660357 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657475 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657478 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657480 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657483 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657485 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657489 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657491 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657494 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657496 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657499 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.657502 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658720 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658734 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658742 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658746 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658751 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658755 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658760 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658764 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658768 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658771 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 04:38:57.660836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658774 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658777 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658780 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658783 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658787 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658790 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658793 2579 flags.go:64] FLAG: --cloud-config=""
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658795 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658799 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658803 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658806 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658809 2579 flags.go:64] FLAG: --config-dir=""
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658811 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658814 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658818 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658821 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658824 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658828 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658831 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658834 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658837 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658840 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658843 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658847 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658850 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 04:38:57.661340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658853 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658856 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658860 2579 flags.go:64] FLAG: --enable-server="true"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658863 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658868 2579 flags.go:64] FLAG: --event-burst="100"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658871 2579 flags.go:64] FLAG: --event-qps="50"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658875 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658879 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658882 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658885 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658888 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658892 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658895 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658898 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658901 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658904 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658907 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658910 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658912 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 04:38:57.661947 ip-10-0-135-122
kubenswrapper[2579]: I0421 04:38:57.658915 2579 flags.go:64] FLAG: --feature-gates="" Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658919 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658922 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658925 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658928 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658931 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 21 04:38:57.661947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658934 2579 flags.go:64] FLAG: --help="false" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658937 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-135-122.ec2.internal" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658940 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658943 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658946 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658949 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658953 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658955 2579 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658958 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658961 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658964 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658967 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658971 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658973 2579 flags.go:64] FLAG: --kube-reserved="" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658976 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658979 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658982 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658984 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658988 2579 flags.go:64] FLAG: --lock-file="" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658990 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658994 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.658997 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:38:57.659003 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 04:38:57.662565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659006 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659009 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659012 2579 flags.go:64] FLAG: --logging-format="text" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659014 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659018 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659021 2579 flags.go:64] FLAG: --manifest-url="" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659024 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659028 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659031 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659035 2579 flags.go:64] FLAG: --max-pods="110" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659038 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659040 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659043 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659047 2579 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659050 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659053 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659056 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659064 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659067 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659070 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659074 2579 flags.go:64] FLAG: --pod-cidr="" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659077 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659083 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659086 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 04:38:57.663099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659089 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659092 2579 flags.go:64] FLAG: --port="10250" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659095 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659098 
2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c445f591917c67c5" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659102 2579 flags.go:64] FLAG: --qos-reserved="" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659105 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659108 2579 flags.go:64] FLAG: --register-node="true" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659111 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659114 2579 flags.go:64] FLAG: --register-with-taints="" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659118 2579 flags.go:64] FLAG: --registry-burst="10" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659121 2579 flags.go:64] FLAG: --registry-qps="5" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659123 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659126 2579 flags.go:64] FLAG: --reserved-memory="" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659130 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659133 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659136 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659139 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659141 2579 flags.go:64] FLAG: --runonce="false" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659144 2579 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659147 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659150 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659153 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659156 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659159 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659162 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659165 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 04:38:57.663732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659167 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659170 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659173 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659176 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659179 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659183 2579 flags.go:64] FLAG: --system-cgroups="" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659185 2579 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659191 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659194 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659197 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659201 2579 flags.go:64] FLAG: --tls-min-version="" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659203 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659206 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659209 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659212 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659215 2579 flags.go:64] FLAG: --v="2" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659219 2579 flags.go:64] FLAG: --version="false" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659223 2579 flags.go:64] FLAG: --vmodule="" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659227 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.659231 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659326 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:38:57.664420 ip-10-0-135-122 
kubenswrapper[2579]: W0421 04:38:57.659330 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659333 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659336 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:38:57.664420 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659338 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659341 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659344 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659348 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659351 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659354 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659357 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659373 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659376 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659382 2579 feature_gate.go:328] 
unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659385 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659388 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659391 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659394 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659397 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659400 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659402 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659405 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659408 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:38:57.665001 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659411 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659413 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659416 2579 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659419 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659422 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659424 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659427 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659429 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659432 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659435 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659437 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659440 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659442 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659445 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659447 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659450 
2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659453 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659455 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659458 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659460 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 04:38:57.665517 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659463 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659465 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659469 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659472 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659475 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659477 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659482 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659485 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659488 2579 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659490 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659492 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659495 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659498 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659501 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659503 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659505 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659510 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659513 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659516 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659519 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:38:57.666015 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659521 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659524 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659527 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659530 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659532 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659535 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659537 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659540 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659542 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659545 2579 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659547 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659550 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659552 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659555 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659559 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659561 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659564 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659566 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659570 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659574 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:38:57.666568 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659576 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:38:57.667087 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659579 2579 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 21 04:38:57.667087 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.659581 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:38:57.667087 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.660195 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 04:38:57.668147 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.668127 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 04:38:57.668181 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.668149 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 04:38:57.668208 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668199 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:38:57.668208 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668206 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668220 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668224 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668228 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668231 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668234 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668237 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668240 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668243 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668246 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668249 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668251 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668255 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668257 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668260 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668262 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668265 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668269 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668272 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668274 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:57.668276 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668278 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668281 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668284 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668301 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668305 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668309 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668312 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668315 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668317 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668321 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668325 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668328 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668330 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668333 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668335 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668338 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668340 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668343 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668346 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668348 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:57.668814 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668351 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668353 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668356 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668359 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668377 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668380 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668383 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668385 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668388 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668391 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668393 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668396 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668399 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668401 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668405 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668408 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668410 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668413 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668416 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:57.669294 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668419 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668421 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668424 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668426 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668429 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668432 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668435 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668437 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668440 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668442 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668445 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668447 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668450 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668452 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668455 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668457 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668460 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668462 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668465 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668467 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:57.669776 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668470 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668473 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668476 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668478 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668481 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668483 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.668489 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668603 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668608 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668611 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668614 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668617 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668619 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668622 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668625 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668627 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:57.670248 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668630 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668635 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668638 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668640 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668643 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668645 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668648 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668650 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668653 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668655 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668660 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668664 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668667 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668670 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668672 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668675 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668678 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668680 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668683 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:57.670649 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668686 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668688 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668691 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668694 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668697 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668700 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668703 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668705 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668708 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668710 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668713 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668717 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668720 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668722 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668725 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668728 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668731 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668734 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668736 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:57.671104 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668739 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668741 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668744 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668746 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668749 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668752 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668754 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668757 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668759 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668761 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668764 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668766 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668769 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668771 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668775 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668777 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668780 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668783 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668785 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668788 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:57.671674 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668791 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668793 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668796 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668798 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668801 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668804 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668806 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668809 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668811 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668815 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668817 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668820 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668823 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668825 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668828 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668831 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668833 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668836 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:57.672151 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:57.668838 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:57.672608 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.668843 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:38:57.672608 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.669573 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 04:38:57.672608 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.671782 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 04:38:57.672694 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.672653 2579 server.go:1019] "Starting client certificate rotation"
Apr 21 04:38:57.672758 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.672744 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:38:57.672788 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.672781 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:38:57.692063 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.692042 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:38:57.694721 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.694702 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:38:57.708609 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.708584 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 21 04:38:57.714546 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.714529 2579 log.go:25] "Validated CRI v1 image API"
Apr 21 04:38:57.715897 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.715857 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 04:38:57.719586 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.719566 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a025e37a-fa70-4ab1-9ea8-50092ce6b72b:/dev/nvme0n1p4 c71ee834-140c-4cb6-925a-0a711abf8d75:/dev/nvme0n1p3]
Apr 21 04:38:57.719652 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.719585 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 04:38:57.722691 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.722674 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:38:57.725028 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.724919 2579 manager.go:217] Machine: {Timestamp:2026-04-21 04:38:57.723267373 +0000 UTC m=+0.349944042 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096954 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a936a8419bb1dbbb7dd649657c1be SystemUUID:ec2a936a-8419-bb1d-bbb7-dd649657c1be BootID:a131e130-5cb6-42f1-b15d-2cf7f9fe55db Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d6:9b:77:d4:a5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d6:9b:77:d4:a5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:da:b5:56:bc:1c:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 04:38:57.725028 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.725024 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 04:38:57.725137 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.725109 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 04:38:57.726167 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.726142 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 04:38:57.726304 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.726168 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-122.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 04:38:57.726346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.726313 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 04:38:57.726346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.726321 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 04:38:57.726346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.726333 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:38:57.727869 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.727857 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:38:57.729538 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.729528 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:38:57.729651 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.729642 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 04:38:57.731348 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.731338 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 21 04:38:57.731414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.731359 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 04:38:57.731414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.731392 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 04:38:57.731414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.731403 2579 kubelet.go:397] "Adding apiserver pod source" Apr 21 04:38:57.731414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.731415 2579 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 04:38:57.732519 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.732508 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:38:57.732571 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.732526 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:38:57.735030 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.735014 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 04:38:57.736563 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.736550 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 04:38:57.738104 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738091 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 04:38:57.738104 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738108 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738115 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738121 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738128 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738134 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738150 2579 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738157 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738164 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738169 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738185 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 04:38:57.738203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.738193 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 04:38:57.739604 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.739591 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 04:38:57.739653 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.739607 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 04:38:57.741778 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.741747 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-122.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 04:38:57.741858 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.741803 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 04:38:57.743426 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:38:57.743413 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 04:38:57.743473 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.743448 2579 server.go:1295] "Started kubelet" Apr 21 04:38:57.743575 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.743528 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 04:38:57.743611 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.743530 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 04:38:57.743650 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.743620 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 04:38:57.744256 ip-10-0-135-122 systemd[1]: Started Kubernetes Kubelet. Apr 21 04:38:57.746293 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.746272 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 04:38:57.747196 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.747171 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9chjt" Apr 21 04:38:57.748197 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.748174 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 21 04:38:57.752092 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.752075 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9chjt" Apr 21 04:38:57.752789 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.752772 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-122.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 04:38:57.753555 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.752714 2579 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-122.ec2.internal.18a845616d2c5094 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-122.ec2.internal,UID:ip-10-0-135-122.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-122.ec2.internal,},FirstTimestamp:2026-04-21 04:38:57.743425684 +0000 UTC m=+0.370102353,LastTimestamp:2026-04-21 04:38:57.743425684 +0000 UTC m=+0.370102353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-122.ec2.internal,}" Apr 21 04:38:57.753627 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.753574 2579 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 04:38:57.754405 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.754388 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 04:38:57.755584 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.754935 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 04:38:57.756667 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.756645 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 04:38:57.756766 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.756692 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 04:38:57.756766 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.756707 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 04:38:57.756766 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.756704 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:57.756901 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.756788 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 21 04:38:57.756901 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.756795 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 21 04:38:57.757321 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.757302 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 04:38:57.757321 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.757321 2579 factory.go:55] Registering systemd factory Apr 21 04:38:57.757475 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.757330 2579 factory.go:223] Registration of the 
systemd container factory successfully Apr 21 04:38:57.757565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.757551 2579 factory.go:153] Registering CRI-O factory Apr 21 04:38:57.757622 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.757569 2579 factory.go:223] Registration of the crio container factory successfully Apr 21 04:38:57.757622 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.757593 2579 factory.go:103] Registering Raw factory Apr 21 04:38:57.757622 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.757608 2579 manager.go:1196] Started watching for new ooms in manager Apr 21 04:38:57.758201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.758177 2579 manager.go:319] Starting recovery of all containers Apr 21 04:38:57.761976 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.761953 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:38:57.764716 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.764690 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-122.ec2.internal\" not found" node="ip-10-0-135-122.ec2.internal" Apr 21 04:38:57.768732 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.768575 2579 manager.go:324] Recovery completed Apr 21 04:38:57.772840 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.772827 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:57.775076 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.775061 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:57.775140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.775086 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:57.775140 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:38:57.775096 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:57.775592 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.775580 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 04:38:57.775592 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.775592 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 04:38:57.775696 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.775610 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:38:57.778350 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.778337 2579 policy_none.go:49] "None policy: Start" Apr 21 04:38:57.778442 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.778355 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 04:38:57.778442 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.778384 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 21 04:38:57.808939 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.808921 2579 manager.go:341] "Starting Device Plugin manager" Apr 21 04:38:57.809056 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.808965 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 04:38:57.809056 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.808980 2579 server.go:85] "Starting device plugin registration server" Apr 21 04:38:57.809245 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.809233 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 04:38:57.809295 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.809249 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 04:38:57.809347 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.809333 2579 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 21 04:38:57.809534 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.809454 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 04:38:57.809534 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.809463 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 04:38:57.822968 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.810029 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 04:38:57.822968 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.810086 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:57.890408 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.890375 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 04:38:57.891598 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.891576 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 04:38:57.891720 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.891605 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 04:38:57.891720 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.891626 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 04:38:57.891720 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.891633 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 04:38:57.891720 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.891671 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 04:38:57.893982 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.893955 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:38:57.909943 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.909926 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:57.910756 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.910740 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:57.910834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.910768 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:57.910834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.910778 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:57.910834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.910803 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-122.ec2.internal" Apr 21 04:38:57.917961 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.917945 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-122.ec2.internal" Apr 21 04:38:57.918025 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.917967 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-122.ec2.internal\": node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 
04:38:57.939916 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:57.939896 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:57.992183 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.992150 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal"] Apr 21 04:38:57.992335 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.992253 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:57.993247 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.993233 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:57.993346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.993265 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:57.993346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.993280 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:57.994606 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.994582 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:57.994735 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.994720 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 04:38:57.994786 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.994754 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:57.996380 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.996347 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:57.996473 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.996390 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:57.996473 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.996400 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:57.996473 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.996347 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:57.996473 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.996472 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:57.996600 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.996484 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:57.997649 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.997636 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" Apr 21 04:38:57.997715 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.997659 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:57.998395 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.998379 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:57.998484 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.998405 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:57.998484 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:57.998415 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:58.016934 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.016917 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-122.ec2.internal\" not found" node="ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.020984 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.020968 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-122.ec2.internal\" not found" node="ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.040178 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.040157 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:58.141148 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.141103 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:58.158502 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.158431 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/da09097ea442c891eed521830fa29838-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal\" (UID: \"da09097ea442c891eed521830fa29838\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.158502 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.158469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da09097ea442c891eed521830fa29838-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal\" (UID: \"da09097ea442c891eed521830fa29838\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.158502 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.158490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22c8428fe5241945337c215cc12a9733-config\") pod \"kube-apiserver-proxy-ip-10-0-135-122.ec2.internal\" (UID: \"22c8428fe5241945337c215cc12a9733\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.241829 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.241799 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:58.259154 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.259128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/da09097ea442c891eed521830fa29838-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal\" (UID: \"da09097ea442c891eed521830fa29838\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 
04:38:58.259250 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.259162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da09097ea442c891eed521830fa29838-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal\" (UID: \"da09097ea442c891eed521830fa29838\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.259250 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.259187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22c8428fe5241945337c215cc12a9733-config\") pod \"kube-apiserver-proxy-ip-10-0-135-122.ec2.internal\" (UID: \"22c8428fe5241945337c215cc12a9733\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.259250 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.259238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/22c8428fe5241945337c215cc12a9733-config\") pod \"kube-apiserver-proxy-ip-10-0-135-122.ec2.internal\" (UID: \"22c8428fe5241945337c215cc12a9733\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.259380 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.259244 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da09097ea442c891eed521830fa29838-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal\" (UID: \"da09097ea442c891eed521830fa29838\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.259380 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.259244 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/da09097ea442c891eed521830fa29838-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal\" (UID: \"da09097ea442c891eed521830fa29838\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.319287 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.319254 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.324221 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.324195 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" Apr 21 04:38:58.342496 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.342471 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:58.443137 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.443036 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:58.543548 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.543509 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:58.644074 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.644043 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found" Apr 21 04:38:58.672498 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.672474 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 04:38:58.672925 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.672632 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:38:58.672925 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.672645 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:38:58.745054 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.745020 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found"
Apr 21 04:38:58.754781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.754760 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 04:38:58.754781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.754767 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 04:33:57 +0000 UTC" deadline="2027-11-21 02:58:55.869964571 +0000 UTC"
Apr 21 04:38:58.754916 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.754790 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13894h19m57.115177938s"
Apr 21 04:38:58.770803 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.770779 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:38:58.773568 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.773547 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:38:58.788657 ip-10-0-135-122 kubenswrapper[2579]: I0421 
04:38:58.788636 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qrfcb"
Apr 21 04:38:58.796718 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.796696 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qrfcb"
Apr 21 04:38:58.810377 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:58.810331 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22c8428fe5241945337c215cc12a9733.slice/crio-d4dcb53195e695511e703c9b22e752e1862763ab798df050fd89701d166c0fd0 WatchSource:0}: Error finding container d4dcb53195e695511e703c9b22e752e1862763ab798df050fd89701d166c0fd0: Status 404 returned error can't find the container with id d4dcb53195e695511e703c9b22e752e1862763ab798df050fd89701d166c0fd0
Apr 21 04:38:58.810836 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:38:58.810818 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda09097ea442c891eed521830fa29838.slice/crio-da133e8f789f879c92e5858dbf3bb31ccc560ae8a82f794a66f96900baa603b9 WatchSource:0}: Error finding container da133e8f789f879c92e5858dbf3bb31ccc560ae8a82f794a66f96900baa603b9: Status 404 returned error can't find the container with id da133e8f789f879c92e5858dbf3bb31ccc560ae8a82f794a66f96900baa603b9
Apr 21 04:38:58.814109 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.814095 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:38:58.845830 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.845800 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found"
Apr 21 04:38:58.895254 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.895208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" event={"ID":"da09097ea442c891eed521830fa29838","Type":"ContainerStarted","Data":"da133e8f789f879c92e5858dbf3bb31ccc560ae8a82f794a66f96900baa603b9"}
Apr 21 04:38:58.896141 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:58.896117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" event={"ID":"22c8428fe5241945337c215cc12a9733","Type":"ContainerStarted","Data":"d4dcb53195e695511e703c9b22e752e1862763ab798df050fd89701d166c0fd0"}
Apr 21 04:38:58.946318 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:58.946278 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found"
Apr 21 04:38:59.046902 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.046815 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-122.ec2.internal\" not found"
Apr 21 04:38:59.073773 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.073746 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:38:59.156946 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.156918 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal"
Apr 21 04:38:59.167189 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.167169 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 04:38:59.168127 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.168102 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal"
Apr 21 04:38:59.176896 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.176874 
2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 04:38:59.578910 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.578877 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:38:59.733474 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.733442 2579 apiserver.go:52] "Watching apiserver"
Apr 21 04:38:59.740973 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.740948 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 04:38:59.743156 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.743127 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5x8r5","openshift-multus/network-metrics-daemon-jxwc7","kube-system/konnectivity-agent-djdfl","openshift-cluster-node-tuning-operator/tuned-bg69d","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal","openshift-multus/multus-additional-cni-plugins-hvz8m","openshift-multus/multus-wkps9","openshift-network-diagnostics/network-check-target-chcg6","openshift-network-operator/iptables-alerter-t58lb","openshift-ovn-kubernetes/ovnkube-node-dv4qj","kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44","openshift-dns/node-resolver-db4v4"]
Apr 21 04:38:59.745233 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.745211 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.746251 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.746229 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:38:59.746353 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.746296 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:38:59.747280 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.747259 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.747780 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.747760 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 04:38:59.747883 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.747766 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 04:38:59.747883 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.747811 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 04:38:59.747991 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.747934 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 04:38:59.747991 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.747953 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ckwn4\""
Apr 21 04:38:59.748495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.748473 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.749523 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.749347 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:38:59.749523 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.749455 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.749724 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.749702 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 04:38:59.750204 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.749961 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qk6l9\""
Apr 21 04:38:59.750682 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.750663 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 04:38:59.750682 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.750677 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 04:38:59.750822 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.750749 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:38:59.750822 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.750802 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:38:59.750951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.750930 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5p9gd\""
Apr 21 04:38:59.751418 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.751400 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 04:38:59.751512 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.751492 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 04:38:59.751870 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.751851 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 04:38:59.751969 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.751852 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jhc9c\""
Apr 21 04:38:59.751969 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.751957 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t58lb"
Apr 21 04:38:59.753261 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.752998 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.754170 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.754152 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-djdfl"
Apr 21 04:38:59.754611 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.754590 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 04:38:59.754707 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.754646 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 04:38:59.754707 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.754657 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-smttk\""
Apr 21 04:38:59.754707 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.754574 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:38:59.755010 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.754990 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 04:38:59.755285 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.755264 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 04:38:59.755481 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.755465 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.756911 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.756832 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vkg9t\""
Apr 21 04:38:59.757748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.757060 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 04:38:59.757748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.757168 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 04:38:59.757748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.757209 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 04:38:59.757748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.757426 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6kmv9\""
Apr 21 04:38:59.757748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.757571 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 04:38:59.757748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.757684 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 04:38:59.758473 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.758454 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 04:38:59.758561 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.758530 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 04:38:59.758628 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.758610 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dmpdz\""
Apr 21 04:38:59.759126 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.758879 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 04:38:59.759126 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.759060 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-db4v4"
Apr 21 04:38:59.759126 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.759110 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 04:38:59.761177 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.761129 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 04:38:59.761303 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.761285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 04:38:59.761451 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.761432 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vz5cd\""
Apr 21 04:38:59.767827 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.767803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " 
pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.767926 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.767878 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-node-log\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.767926 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.767911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-registration-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.768051 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.767940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:38:59.768051 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.767991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c67kt\" (UniqueName: \"kubernetes.io/projected/6b281d59-c062-4407-95da-057a82e47cba-kube-api-access-c67kt\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.768051 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768026 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7chth\" (UniqueName: \"kubernetes.io/projected/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-kube-api-access-7chth\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4"
Apr 21 04:38:59.768208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768055 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-system-cni-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.768208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768084 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-kubelet\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.768208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768114 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-cni-binary-copy\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.768208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768138 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-iptables-alerter-script\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb"
Apr 21 04:38:59.768208 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:38:59.768159 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-host\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.768208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768182 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-systemd\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.768208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768229 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-cni-bin\") pod 
\"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768269 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-env-overrides\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b281d59-c062-4407-95da-057a82e47cba-ovn-node-metrics-cert\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-serviceca\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkqpp\" (UniqueName: \"kubernetes.io/projected/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-kube-api-access-mkqpp\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768390 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-run-ovn-kubernetes\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768429 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-kubernetes\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768457 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysctl-d\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-lib-modules\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768505 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8xs6\" (UniqueName: \"kubernetes.io/projected/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-kube-api-access-q8xs6\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:38:59.768532 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-slash\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.768573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-ovnkube-script-lib\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysctl-conf\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768604 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-run\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3250d03-5ab0-4acf-8145-601ce40b14a2-agent-certs\") pod \"konnectivity-agent-djdfl\" (UID: \"c3250d03-5ab0-4acf-8145-601ce40b14a2\") " 
pod="kube-system/konnectivity-agent-djdfl"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-multus-certs\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-var-lib-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768689 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-var-lib-kubelet\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768725 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-hosts-file\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-cnibin\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lxc\" (UniqueName: \"kubernetes.io/projected/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-kube-api-access-l7lxc\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-cni-netd\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768915 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-device-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.768976 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-modprobe-d\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-tuned\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdpl\" (UniqueName: \"kubernetes.io/projected/65169960-4a16-4850-a073-5d9addbb46e9-kube-api-access-ggdpl\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.769249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769063 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-cni-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-netns\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " 
pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-daemon-config\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-ovn\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px5qw\" (UniqueName: \"kubernetes.io/projected/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-kube-api-access-px5qw\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-cnibin\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4crl\" (UniqueName: \"kubernetes.io/projected/506082c4-3364-48e7-a27f-927f2729dde4-kube-api-access-h4crl\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-kubelet\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769275 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-ovnkube-config\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769318 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-sys-fs\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.769902 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:38:59.769381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-k8s-cni-cncf-io\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769410 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769473 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-os-release\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769525 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-run-netns\") 
pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.769902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysconfig\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769605 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3250d03-5ab0-4acf-8145-601ce40b14a2-konnectivity-ca\") pod \"konnectivity-agent-djdfl\" (UID: \"c3250d03-5ab0-4acf-8145-601ce40b14a2\") " pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769634 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-os-release\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769657 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-hostroot\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769693 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lbj\" (UniqueName: \"kubernetes.io/projected/e5103329-ae63-4574-9dcc-140804f95f79-kube-api-access-26lbj\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769736 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-tmp\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-socket-dir-parent\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-cni-multus\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.770639 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-host-slash\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-etc-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-host\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769943 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-socket-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.769984 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-cni-bin\") pod \"multus-wkps9\" (UID: 
\"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-etc-kubernetes\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770048 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-system-cni-dir\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-systemd-units\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.770639 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-systemd\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.771439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-log-socket\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.771439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-sys\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.771439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-tmp-dir\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4" Apr 21 04:38:59.771439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770194 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-cni-binary-copy\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.771439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.770220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-conf-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.797401 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.797355 2579 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:33:58 +0000 UTC" deadline="2027-11-21 19:42:36.885670443 +0000 UTC" Apr 21 04:38:59.797401 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.797399 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13911h3m37.088274868s" Apr 21 04:38:59.858044 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.857973 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 04:38:59.871317 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-ovn\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.871491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px5qw\" (UniqueName: \"kubernetes.io/projected/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-kube-api-access-px5qw\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.871491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871354 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.871491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871397 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-cnibin\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.871491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4crl\" (UniqueName: \"kubernetes.io/projected/506082c4-3364-48e7-a27f-927f2729dde4-kube-api-access-h4crl\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.871491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871461 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-kubelet\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.871491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-ovnkube-config\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871517 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-sys-fs\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-ovn\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871564 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-k8s-cni-cncf-io\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-cnibin\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871658 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871734 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-os-release\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871750 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-sys-fs\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.871781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871783 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-run-netns\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-k8s-cni-cncf-io\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysconfig\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3250d03-5ab0-4acf-8145-601ce40b14a2-konnectivity-ca\") pod \"konnectivity-agent-djdfl\" (UID: \"c3250d03-5ab0-4acf-8145-601ce40b14a2\") " pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-os-release\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871938 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-hostroot\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 
21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.871957 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26lbj\" (UniqueName: \"kubernetes.io/projected/e5103329-ae63-4574-9dcc-140804f95f79-kube-api-access-26lbj\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871986 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.871995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-tmp\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-socket-dir-parent\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-cni-multus\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872078 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-os-release\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872091 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-host-slash\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.872130 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:00.372085562 +0000 UTC m=+2.998762222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-hostroot\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.872239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-host-slash\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-run-netns\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-kubelet\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysconfig\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-os-release\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872886 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-cni-multus\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-etc-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872968 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-host\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.872994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.872988 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-socket-dir-parent\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-socket-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-cni-bin\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-etc-kubernetes\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-host\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873133 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-system-cni-dir\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873162 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-etc-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-cni-bin\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873328 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-socket-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-etc-kubernetes\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.873505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873470 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/506082c4-3364-48e7-a27f-927f2729dde4-system-cni-dir\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.873951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-systemd-units\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.873951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-systemd-units\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.873951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-ovnkube-config\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.873951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-systemd\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.873951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-log-socket\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.873951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-systemd\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873959 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-sys\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.873982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-tmp-dir\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874002 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3250d03-5ab0-4acf-8145-601ce40b14a2-konnectivity-ca\") pod \"konnectivity-agent-djdfl\" (UID: \"c3250d03-5ab0-4acf-8145-601ce40b14a2\") " pod="kube-system/konnectivity-agent-djdfl"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874008 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-log-socket\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874107 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-sys\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-cni-binary-copy\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-conf-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.874223 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-node-log\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-registration-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-tmp-dir\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c67kt\" (UniqueName: \"kubernetes.io/projected/6b281d59-c062-4407-95da-057a82e47cba-kube-api-access-c67kt\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7chth\" (UniqueName: \"kubernetes.io/projected/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-kube-api-access-7chth\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874423 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-node-log\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874430 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-system-cni-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-conf-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-system-cni-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.874576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-kubelet\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-cni-binary-copy\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-iptables-alerter-script\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874704 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-host\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-systemd\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874800 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-cni-binary-copy\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-cni-bin\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.874989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-env-overrides\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.875026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875017 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b281d59-c062-4407-95da-057a82e47cba-ovn-node-metrics-cert\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.875472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-serviceca\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.875472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkqpp\" (UniqueName: \"kubernetes.io/projected/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-kube-api-access-mkqpp\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.875472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-run-ovn-kubernetes\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.875472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-run-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.875472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875300 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-kubernetes\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.875472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-cni-bin\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.875472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875376 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-cni-binary-copy\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-var-lib-kubelet\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875950 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-registration-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-systemd\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-host\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.875237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-kubernetes\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876489 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-serviceca\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysctl-d\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876584 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-lib-modules\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/506082c4-3364-48e7-a27f-927f2729dde4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876614 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysctl-d\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-iptables-alerter-script\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xs6\" (UniqueName: \"kubernetes.io/projected/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-kube-api-access-q8xs6\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-slash\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876699 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-run-ovn-kubernetes\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-lib-modules\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.877386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-slash\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.876875 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-tmp\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-ovnkube-script-lib\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-env-overrides\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysctl-conf\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-run\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3250d03-5ab0-4acf-8145-601ce40b14a2-agent-certs\") pod \"konnectivity-agent-djdfl\" (UID: \"c3250d03-5ab0-4acf-8145-601ce40b14a2\") " pod="kube-system/konnectivity-agent-djdfl"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877260 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-multus-certs\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877291 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-var-lib-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-var-lib-kubelet\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877350 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44"
Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877394 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-hosts-file\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4"
Apr 21 
04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-cnibin\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lxc\" (UniqueName: \"kubernetes.io/projected/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-kube-api-access-l7lxc\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb" Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-cni-netd\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-device-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-modprobe-d\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.878184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-tuned\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdpl\" (UniqueName: \"kubernetes.io/projected/65169960-4a16-4850-a073-5d9addbb46e9-kube-api-access-ggdpl\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-cni-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-netns\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-daemon-config\") pod \"multus-wkps9\" (UID: 
\"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b281d59-c062-4407-95da-057a82e47cba-ovnkube-script-lib\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877830 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-sysctl-conf\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-var-lib-openvswitch\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.877966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-cni-dir\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878024 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-netns\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " 
pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878063 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-host-run-multus-certs\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-hosts-file\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-device-dir\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-run\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b281d59-c062-4407-95da-057a82e47cba-host-cni-netd\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 
04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878743 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-cnibin\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-modprobe-d\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.878952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.878943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-var-lib-kubelet\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.879735 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.879031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65169960-4a16-4850-a073-5d9addbb46e9-etc-selinux\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:38:59.879735 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.879578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-multus-daemon-config\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.880799 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:38:59.880774 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-etc-tuned\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.881577 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.881554 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b281d59-c062-4407-95da-057a82e47cba-ovn-node-metrics-cert\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.881696 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.881658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3250d03-5ab0-4acf-8145-601ce40b14a2-agent-certs\") pod \"konnectivity-agent-djdfl\" (UID: \"c3250d03-5ab0-4acf-8145-601ce40b14a2\") " pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:38:59.882490 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.882468 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4crl\" (UniqueName: \"kubernetes.io/projected/506082c4-3364-48e7-a27f-927f2729dde4-kube-api-access-h4crl\") pod \"multus-additional-cni-plugins-hvz8m\" (UID: \"506082c4-3364-48e7-a27f-927f2729dde4\") " pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:38:59.883859 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.883829 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:38:59.883859 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.883854 2579 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:38:59.883999 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.883871 2579 projected.go:194] Error preparing data for projected volume kube-api-access-dbf4b for pod openshift-network-diagnostics/network-check-target-chcg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:38:59.883999 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:38:59.883936 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b podName:ca0a3fc9-06ad-4561-9392-21daefb76530 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:00.383918419 +0000 UTC m=+3.010595096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dbf4b" (UniqueName: "kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b") pod "network-check-target-chcg6" (UID: "ca0a3fc9-06ad-4561-9392-21daefb76530") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:38:59.885684 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.885661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px5qw\" (UniqueName: \"kubernetes.io/projected/4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e-kube-api-access-px5qw\") pod \"tuned-bg69d\" (UID: \"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e\") " pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:38:59.886189 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.886144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7chth\" (UniqueName: 
\"kubernetes.io/projected/b54dd57a-4c1d-4f99-a559-3e4be3f7266f-kube-api-access-7chth\") pod \"node-resolver-db4v4\" (UID: \"b54dd57a-4c1d-4f99-a559-3e4be3f7266f\") " pod="openshift-dns/node-resolver-db4v4" Apr 21 04:38:59.886788 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.886746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lbj\" (UniqueName: \"kubernetes.io/projected/e5103329-ae63-4574-9dcc-140804f95f79-kube-api-access-26lbj\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:38:59.887829 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.887805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8xs6\" (UniqueName: \"kubernetes.io/projected/43feefbe-ff70-4e7a-8ad0-1791e41e4c6c-kube-api-access-q8xs6\") pod \"node-ca-5x8r5\" (UID: \"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c\") " pod="openshift-image-registry/node-ca-5x8r5" Apr 21 04:38:59.887958 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.887936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lxc\" (UniqueName: \"kubernetes.io/projected/feb386b7-4b7c-4ba5-9c98-fd202d27be4d-kube-api-access-l7lxc\") pod \"iptables-alerter-t58lb\" (UID: \"feb386b7-4b7c-4ba5-9c98-fd202d27be4d\") " pod="openshift-network-operator/iptables-alerter-t58lb" Apr 21 04:38:59.888062 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.888050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkqpp\" (UniqueName: \"kubernetes.io/projected/e5cf2a49-609f-4790-abef-7cf1ee58cdbc-kube-api-access-mkqpp\") pod \"multus-wkps9\" (UID: \"e5cf2a49-609f-4790-abef-7cf1ee58cdbc\") " pod="openshift-multus/multus-wkps9" Apr 21 04:38:59.888539 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.888516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c67kt\" (UniqueName: \"kubernetes.io/projected/6b281d59-c062-4407-95da-057a82e47cba-kube-api-access-c67kt\") pod \"ovnkube-node-dv4qj\" (UID: \"6b281d59-c062-4407-95da-057a82e47cba\") " pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:38:59.888751 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:38:59.888728 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggdpl\" (UniqueName: \"kubernetes.io/projected/65169960-4a16-4850-a073-5d9addbb46e9-kube-api-access-ggdpl\") pod \"aws-ebs-csi-driver-node-dcx44\" (UID: \"65169960-4a16-4850-a073-5d9addbb46e9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:39:00.057715 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.057678 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wkps9" Apr 21 04:39:00.065560 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.065533 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5x8r5" Apr 21 04:39:00.074158 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.074133 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bg69d" Apr 21 04:39:00.078783 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.078762 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" Apr 21 04:39:00.085324 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.085299 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t58lb" Apr 21 04:39:00.091926 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.091905 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:39:00.097533 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.097511 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:39:00.104197 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.104177 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" Apr 21 04:39:00.105452 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.105437 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:00.107449 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.107430 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-db4v4" Apr 21 04:39:00.380578 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.380541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:39:00.380735 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:00.380693 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:00.380795 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:00.380754 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:01.380738551 +0000 UTC m=+4.007415207 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:00.460142 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:00.460100 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d4e2d6b_0b6a_44c6_8746_1accd65dcb5e.slice/crio-b39f01f05c6e1c0f3d3670a894a1739c6b7cf3f0a6db8909e99c7a90b8f4bf89 WatchSource:0}: Error finding container b39f01f05c6e1c0f3d3670a894a1739c6b7cf3f0a6db8909e99c7a90b8f4bf89: Status 404 returned error can't find the container with id b39f01f05c6e1c0f3d3670a894a1739c6b7cf3f0a6db8909e99c7a90b8f4bf89 Apr 21 04:39:00.461717 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:00.461393 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3250d03_5ab0_4acf_8145_601ce40b14a2.slice/crio-63a7d22fd4967c55a5210736ca925f9abda4a5b68ca616c34e5daad24a888077 WatchSource:0}: Error finding container 63a7d22fd4967c55a5210736ca925f9abda4a5b68ca616c34e5daad24a888077: Status 404 returned error can't find the container with id 63a7d22fd4967c55a5210736ca925f9abda4a5b68ca616c34e5daad24a888077 Apr 21 04:39:00.464473 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:00.464448 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b281d59_c062_4407_95da_057a82e47cba.slice/crio-7ec015e3a265482ce22e125d312229791f97fbc4d129f035dceb113c366106e4 WatchSource:0}: Error finding container 7ec015e3a265482ce22e125d312229791f97fbc4d129f035dceb113c366106e4: Status 404 returned error can't find the container with id 7ec015e3a265482ce22e125d312229791f97fbc4d129f035dceb113c366106e4 Apr 21 04:39:00.465184 
ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:00.465159 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65169960_4a16_4850_a073_5d9addbb46e9.slice/crio-73cff117a90360d5fde96b72fabf94f262bbccfd4eef3b69638a8f5f7efacbb6 WatchSource:0}: Error finding container 73cff117a90360d5fde96b72fabf94f262bbccfd4eef3b69638a8f5f7efacbb6: Status 404 returned error can't find the container with id 73cff117a90360d5fde96b72fabf94f262bbccfd4eef3b69638a8f5f7efacbb6 Apr 21 04:39:00.465756 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:00.465722 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43feefbe_ff70_4e7a_8ad0_1791e41e4c6c.slice/crio-b3593de2b0c99ec99ed27fb2852b3040b0b6fd8e539888d341615823e140f554 WatchSource:0}: Error finding container b3593de2b0c99ec99ed27fb2852b3040b0b6fd8e539888d341615823e140f554: Status 404 returned error can't find the container with id b3593de2b0c99ec99ed27fb2852b3040b0b6fd8e539888d341615823e140f554 Apr 21 04:39:00.467424 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:00.467403 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506082c4_3364_48e7_a27f_927f2729dde4.slice/crio-aad50885f527f786c2a57bcc50f94d0abe0474026cb28bb9fda1fff7c79a286f WatchSource:0}: Error finding container aad50885f527f786c2a57bcc50f94d0abe0474026cb28bb9fda1fff7c79a286f: Status 404 returned error can't find the container with id aad50885f527f786c2a57bcc50f94d0abe0474026cb28bb9fda1fff7c79a286f Apr 21 04:39:00.476005 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:00.475967 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5cf2a49_609f_4790_abef_7cf1ee58cdbc.slice/crio-e3809d35a70c9ff958ba0f1d49e2f8133c4db8718b069ff04e1a8a7f48232813 WatchSource:0}: 
Error finding container e3809d35a70c9ff958ba0f1d49e2f8133c4db8718b069ff04e1a8a7f48232813: Status 404 returned error can't find the container with id e3809d35a70c9ff958ba0f1d49e2f8133c4db8718b069ff04e1a8a7f48232813 Apr 21 04:39:00.481149 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.481120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:39:00.481319 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:00.481277 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:39:00.481319 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:00.481302 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:39:00.481319 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:00.481314 2579 projected.go:194] Error preparing data for projected volume kube-api-access-dbf4b for pod openshift-network-diagnostics/network-check-target-chcg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:00.481494 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:00.481394 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b podName:ca0a3fc9-06ad-4561-9392-21daefb76530 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:01.481353946 +0000 UTC m=+4.108030606 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dbf4b" (UniqueName: "kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b") pod "network-check-target-chcg6" (UID: "ca0a3fc9-06ad-4561-9392-21daefb76530") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:00.798078 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.797737 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:33:58 +0000 UTC" deadline="2028-01-26 06:42:35.490568694 +0000 UTC"
Apr 21 04:39:00.798078 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.797976 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15482h3m34.692598091s"
Apr 21 04:39:00.892045 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.892011 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:00.892226 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:00.892141 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:00.906256 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.906198 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerStarted","Data":"aad50885f527f786c2a57bcc50f94d0abe0474026cb28bb9fda1fff7c79a286f"}
Apr 21 04:39:00.912304 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.912243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bg69d" event={"ID":"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e","Type":"ContainerStarted","Data":"b39f01f05c6e1c0f3d3670a894a1739c6b7cf3f0a6db8909e99c7a90b8f4bf89"}
Apr 21 04:39:00.923807 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.923749 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5x8r5" event={"ID":"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c","Type":"ContainerStarted","Data":"b3593de2b0c99ec99ed27fb2852b3040b0b6fd8e539888d341615823e140f554"}
Apr 21 04:39:00.926455 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.926410 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t58lb" event={"ID":"feb386b7-4b7c-4ba5-9c98-fd202d27be4d","Type":"ContainerStarted","Data":"085edd0184bad07b7996a457a1673f1537d8d76e2e719a287b5c689fadd44cd1"}
Apr 21 04:39:00.937469 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.937417 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" event={"ID":"65169960-4a16-4850-a073-5d9addbb46e9","Type":"ContainerStarted","Data":"73cff117a90360d5fde96b72fabf94f262bbccfd4eef3b69638a8f5f7efacbb6"}
Apr 21 04:39:00.943850 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.943798 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"7ec015e3a265482ce22e125d312229791f97fbc4d129f035dceb113c366106e4"}
Apr 21 04:39:00.945305 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.945260 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-djdfl" event={"ID":"c3250d03-5ab0-4acf-8145-601ce40b14a2","Type":"ContainerStarted","Data":"63a7d22fd4967c55a5210736ca925f9abda4a5b68ca616c34e5daad24a888077"}
Apr 21 04:39:00.954721 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.954696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" event={"ID":"22c8428fe5241945337c215cc12a9733","Type":"ContainerStarted","Data":"4ccf92a325eec371421adf5d59ff7faa1ab4d8d22cb8b598062fe4b0a3482b6b"}
Apr 21 04:39:00.960630 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.960566 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wkps9" event={"ID":"e5cf2a49-609f-4790-abef-7cf1ee58cdbc","Type":"ContainerStarted","Data":"e3809d35a70c9ff958ba0f1d49e2f8133c4db8718b069ff04e1a8a7f48232813"}
Apr 21 04:39:00.963849 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.963711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-db4v4" event={"ID":"b54dd57a-4c1d-4f99-a559-3e4be3f7266f","Type":"ContainerStarted","Data":"f56b165773513c56ba1f7f62ec63ac104a0c1c6eea8933ef10f28ad32e0e70c2"}
Apr 21 04:39:00.968627 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:00.968581 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-122.ec2.internal" podStartSLOduration=1.968565289 podStartE2EDuration="1.968565289s" podCreationTimestamp="2026-04-21 04:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:00.968541878 +0000 UTC m=+3.595218558" watchObservedRunningTime="2026-04-21 04:39:00.968565289 +0000 UTC m=+3.595241970"
Apr 21 04:39:01.390201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:01.390126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:01.390406 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:01.390291 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:01.390406 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:01.390385 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:03.390349094 +0000 UTC m=+6.017025756 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:01.491902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:01.491219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:01.491902 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:01.491437 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:39:01.491902 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:01.491458 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:39:01.491902 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:01.491470 2579 projected.go:194] Error preparing data for projected volume kube-api-access-dbf4b for pod openshift-network-diagnostics/network-check-target-chcg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:01.491902 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:01.491528 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b podName:ca0a3fc9-06ad-4561-9392-21daefb76530 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:03.491509498 +0000 UTC m=+6.118186171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbf4b" (UniqueName: "kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b") pod "network-check-target-chcg6" (UID: "ca0a3fc9-06ad-4561-9392-21daefb76530") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:01.892591 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:01.892556 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:01.893041 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:01.892698 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:01.990843 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:01.990757 2579 generic.go:358] "Generic (PLEG): container finished" podID="da09097ea442c891eed521830fa29838" containerID="c8c012ae8098fc1907b9e47ee1eb7660a2fb055ba8f2876f383962b0285f2bd7" exitCode=0
Apr 21 04:39:01.990992 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:01.990911 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" event={"ID":"da09097ea442c891eed521830fa29838","Type":"ContainerDied","Data":"c8c012ae8098fc1907b9e47ee1eb7660a2fb055ba8f2876f383962b0285f2bd7"}
Apr 21 04:39:02.892546 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:02.892515 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:02.892734 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:02.892644 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:02.996271 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:02.996237 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" event={"ID":"da09097ea442c891eed521830fa29838","Type":"ContainerStarted","Data":"57ba6df07def5729c80b7c67ed77c44d0556294ab7b0a0a60ce5ed0e6040fe0d"}
Apr 21 04:39:03.407795 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:03.407762 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:03.407979 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:03.407916 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:03.408039 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:03.407986 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.40796628 +0000 UTC m=+10.034642938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:03.508893 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:03.508276 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:03.508893 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:03.508475 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:39:03.508893 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:03.508493 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:39:03.508893 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:03.508506 2579 projected.go:194] Error preparing data for projected volume kube-api-access-dbf4b for pod openshift-network-diagnostics/network-check-target-chcg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:03.508893 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:03.508563 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b podName:ca0a3fc9-06ad-4561-9392-21daefb76530 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.508545171 +0000 UTC m=+10.135221830 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbf4b" (UniqueName: "kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b") pod "network-check-target-chcg6" (UID: "ca0a3fc9-06ad-4561-9392-21daefb76530") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:03.892150 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:03.891968 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:03.892320 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:03.892152 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:04.892903 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:04.892514 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:04.892903 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:04.892643 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:05.893644 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:05.893593 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:05.894065 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:05.893748 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:05.968850 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:05.968202 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-122.ec2.internal" podStartSLOduration=6.968181538 podStartE2EDuration="6.968181538s" podCreationTimestamp="2026-04-21 04:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:03.012059893 +0000 UTC m=+5.638736571" watchObservedRunningTime="2026-04-21 04:39:05.968181538 +0000 UTC m=+8.594858218"
Apr 21 04:39:05.969220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:05.969128 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6rnkb"]
Apr 21 04:39:05.971904 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:05.971882 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:05.972021 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:05.971956 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:06.029182 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.028936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.029182 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.029025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/533e48e5-7652-4081-aa24-2f0eaed21d14-dbus\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.029182 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.029079 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/533e48e5-7652-4081-aa24-2f0eaed21d14-kubelet-config\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.130311 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.129855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.130311 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.129916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/533e48e5-7652-4081-aa24-2f0eaed21d14-dbus\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.130311 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.129969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/533e48e5-7652-4081-aa24-2f0eaed21d14-kubelet-config\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.130311 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:06.130005 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:06.130311 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.130059 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/533e48e5-7652-4081-aa24-2f0eaed21d14-kubelet-config\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.130311 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:06.130080 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret podName:533e48e5-7652-4081-aa24-2f0eaed21d14 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:06.630061121 +0000 UTC m=+9.256737789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret") pod "global-pull-secret-syncer-6rnkb" (UID: "533e48e5-7652-4081-aa24-2f0eaed21d14") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:06.130311 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.130142 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/533e48e5-7652-4081-aa24-2f0eaed21d14-dbus\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.634640 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.634592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:06.634833 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:06.634803 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:06.634890 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:06.634875 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret podName:533e48e5-7652-4081-aa24-2f0eaed21d14 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.634854508 +0000 UTC m=+10.261531165 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret") pod "global-pull-secret-syncer-6rnkb" (UID: "533e48e5-7652-4081-aa24-2f0eaed21d14") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:06.892298 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:06.891947 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:06.892298 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:06.892088 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:07.442773 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:07.442729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:07.443270 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.442866 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:07.443270 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.442932 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:15.442917939 +0000 UTC m=+18.069594594 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:07.543956 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:07.543915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:07.544129 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.544099 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:39:07.544200 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.544131 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:39:07.544200 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.544147 2579 projected.go:194] Error preparing data for projected volume kube-api-access-dbf4b for pod openshift-network-diagnostics/network-check-target-chcg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:07.544302 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.544212 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b podName:ca0a3fc9-06ad-4561-9392-21daefb76530 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:15.544192228 +0000 UTC m=+18.170868885 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbf4b" (UniqueName: "kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b") pod "network-check-target-chcg6" (UID: "ca0a3fc9-06ad-4561-9392-21daefb76530") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:07.645991 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:07.645375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:07.645991 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.645535 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:07.645991 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.645601 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret podName:533e48e5-7652-4081-aa24-2f0eaed21d14 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:09.645587149 +0000 UTC m=+12.272263805 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret") pod "global-pull-secret-syncer-6rnkb" (UID: "533e48e5-7652-4081-aa24-2f0eaed21d14") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:07.895439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:07.895407 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:07.895602 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:07.895458 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:07.895602 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.895541 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:07.895704 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:07.895605 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:08.892425 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:08.892321 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:08.892911 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:08.892473 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:09.658581 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:09.658444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:09.658740 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:09.658602 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:09.658740 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:09.658672 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret podName:533e48e5-7652-4081-aa24-2f0eaed21d14 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:13.6586519 +0000 UTC m=+16.285328555 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret") pod "global-pull-secret-syncer-6rnkb" (UID: "533e48e5-7652-4081-aa24-2f0eaed21d14") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:09.892479 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:09.892392 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:09.892917 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:09.892408 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:09.892917 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:09.892551 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:09.892917 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:09.892612 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:10.892891 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:10.892849 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:10.893315 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:10.892959 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:11.892293 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:11.892256 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:11.892293 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:11.892286 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:11.892546 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:11.892408 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:11.892606 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:11.892550 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:12.891929 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:12.891900 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:12.892334 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:12.891993 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:13.686714 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:13.686681 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:13.686928 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:13.686827 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:13.686928 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:13.686891 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret podName:533e48e5-7652-4081-aa24-2f0eaed21d14 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:21.686873286 +0000 UTC m=+24.313549947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret") pod "global-pull-secret-syncer-6rnkb" (UID: "533e48e5-7652-4081-aa24-2f0eaed21d14") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:13.892544 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:13.892512 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:13.892995 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:13.892639 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:13.892995 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:13.892698 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:13.892995 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:13.892819 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:14.892494 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:14.892453 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:39:14.892714 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:14.892579 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530" Apr 21 04:39:15.500564 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:15.500523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:39:15.500737 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.500692 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:15.500814 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.500776 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:31.500755932 +0000 UTC m=+34.127432588 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:15.601571 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:15.601530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:39:15.601756 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.601670 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:39:15.601756 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.601686 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:39:15.601756 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.601699 2579 projected.go:194] Error preparing data for projected volume kube-api-access-dbf4b for pod openshift-network-diagnostics/network-check-target-chcg6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:15.601880 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.601760 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b podName:ca0a3fc9-06ad-4561-9392-21daefb76530 nodeName:}" failed. 
No retries permitted until 2026-04-21 04:39:31.601744412 +0000 UTC m=+34.228421072 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbf4b" (UniqueName: "kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b") pod "network-check-target-chcg6" (UID: "ca0a3fc9-06ad-4561-9392-21daefb76530") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:15.892404 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:15.892302 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:39:15.892563 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:15.892312 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb" Apr 21 04:39:15.892563 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.892474 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79" Apr 21 04:39:15.892563 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:15.892509 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14" Apr 21 04:39:16.892286 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:16.892249 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:39:16.892709 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:16.892392 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530" Apr 21 04:39:17.892803 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:17.892654 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb" Apr 21 04:39:17.893436 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:17.892712 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:39:17.893436 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:17.892908 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14" Apr 21 04:39:17.893436 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:17.892979 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79" Apr 21 04:39:18.021481 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.021444 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-djdfl" event={"ID":"c3250d03-5ab0-4acf-8145-601ce40b14a2","Type":"ContainerStarted","Data":"0a523af776f0388b56918c0e25cde2727f8d6d2e01d71190d2217e08ea162372"} Apr 21 04:39:18.022807 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.022785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wkps9" event={"ID":"e5cf2a49-609f-4790-abef-7cf1ee58cdbc","Type":"ContainerStarted","Data":"e233af7316f6efbe89f465394f022c3a2f8823c49d5d0f9c7359f6bdb62c4e29"} Apr 21 04:39:18.023920 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.023902 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-db4v4" event={"ID":"b54dd57a-4c1d-4f99-a559-3e4be3f7266f","Type":"ContainerStarted","Data":"f1a400f3a5fa82704334e7d4fe282f1f4ee2b6509463abbcbbb1c02095ef5799"} Apr 21 04:39:18.025530 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.025502 2579 generic.go:358] "Generic (PLEG): container finished" podID="506082c4-3364-48e7-a27f-927f2729dde4" containerID="3d3001e222a399a8ecd86c57debea761ca83ac4ef6d8d8029b54e27f1dd30ea3" exitCode=0 Apr 21 04:39:18.025608 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.025577 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerDied","Data":"3d3001e222a399a8ecd86c57debea761ca83ac4ef6d8d8029b54e27f1dd30ea3"} Apr 21 04:39:18.029359 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.029336 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bg69d" event={"ID":"4d4e2d6b-0b6a-44c6-8746-1accd65dcb5e","Type":"ContainerStarted","Data":"813139ecfda736d7e33fc733ecf7b8d6a80e40901f0dca8cff0e26dccefe4b18"} Apr 21 04:39:18.031269 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.031243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5x8r5" event={"ID":"43feefbe-ff70-4e7a-8ad0-1791e41e4c6c","Type":"ContainerStarted","Data":"3a6d2218dd72e2c9bd8e6bda493f0dbba8c3779905199e94e4ecbedfe69f6c14"} Apr 21 04:39:18.032957 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.032934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" event={"ID":"65169960-4a16-4850-a073-5d9addbb46e9","Type":"ContainerStarted","Data":"a2dc8f6d654411b0b6ea1af3675854f8903d7f0a933c520b479bfe78e4d226a4"} Apr 21 04:39:18.035308 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.035288 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log" Apr 21 04:39:18.035668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.035642 2579 generic.go:358] "Generic (PLEG): container finished" podID="6b281d59-c062-4407-95da-057a82e47cba" containerID="f08b201c07c17e3fb89d8d37f0bec0187f99a69a359bea0d027f68ea23ce6828" exitCode=1 Apr 21 04:39:18.035750 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.035679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" 
event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"247fc78bf0361732e5a0fc95c7b83aba9a2ecfa42e842aca7b34aeed2be40f0c"} Apr 21 04:39:18.035750 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.035696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"85e7d3fc9dc48b15a8c17d4a907c87149d897dd6810f74237b4ab1ffe8c1cc8e"} Apr 21 04:39:18.035750 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.035706 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerDied","Data":"f08b201c07c17e3fb89d8d37f0bec0187f99a69a359bea0d027f68ea23ce6828"} Apr 21 04:39:18.035750 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.035720 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"3508e7599b767489f72512691c78da3b202d95ec16a65cc9a74e6335478f3842"} Apr 21 04:39:18.056327 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.054696 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-djdfl" podStartSLOduration=3.024926205 podStartE2EDuration="20.054678301s" podCreationTimestamp="2026-04-21 04:38:58 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.46328464 +0000 UTC m=+3.089961305" lastFinishedPulling="2026-04-21 04:39:17.493036728 +0000 UTC m=+20.119713401" observedRunningTime="2026-04-21 04:39:18.035757968 +0000 UTC m=+20.662434647" watchObservedRunningTime="2026-04-21 04:39:18.054678301 +0000 UTC m=+20.681354982" Apr 21 04:39:18.069475 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.069436 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5x8r5" 
podStartSLOduration=8.603858561 podStartE2EDuration="21.069423085s" podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.468299189 +0000 UTC m=+3.094975848" lastFinishedPulling="2026-04-21 04:39:12.933863715 +0000 UTC m=+15.560540372" observedRunningTime="2026-04-21 04:39:18.069229753 +0000 UTC m=+20.695906431" watchObservedRunningTime="2026-04-21 04:39:18.069423085 +0000 UTC m=+20.696099762" Apr 21 04:39:18.082934 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.082886 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-db4v4" podStartSLOduration=3.065574285 podStartE2EDuration="20.082873758s" podCreationTimestamp="2026-04-21 04:38:58 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.47172358 +0000 UTC m=+3.098400238" lastFinishedPulling="2026-04-21 04:39:17.489023026 +0000 UTC m=+20.115699711" observedRunningTime="2026-04-21 04:39:18.082227279 +0000 UTC m=+20.708903959" watchObservedRunningTime="2026-04-21 04:39:18.082873758 +0000 UTC m=+20.709550437" Apr 21 04:39:18.100348 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.100304 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wkps9" podStartSLOduration=3.90682729 podStartE2EDuration="21.100288061s" podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.478386779 +0000 UTC m=+3.105063445" lastFinishedPulling="2026-04-21 04:39:17.671847559 +0000 UTC m=+20.298524216" observedRunningTime="2026-04-21 04:39:18.100083308 +0000 UTC m=+20.726759986" watchObservedRunningTime="2026-04-21 04:39:18.100288061 +0000 UTC m=+20.726964739" Apr 21 04:39:18.118335 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.118294 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bg69d" podStartSLOduration=4.049680011 podStartE2EDuration="21.118282207s" 
podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.461899062 +0000 UTC m=+3.088575724" lastFinishedPulling="2026-04-21 04:39:17.530501245 +0000 UTC m=+20.157177920" observedRunningTime="2026-04-21 04:39:18.117919427 +0000 UTC m=+20.744596105" watchObservedRunningTime="2026-04-21 04:39:18.118282207 +0000 UTC m=+20.744958885" Apr 21 04:39:18.774217 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.774189 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 04:39:18.821547 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.821328 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T04:39:18.77421107Z","UUID":"710a513e-07d2-45ef-9aad-a8b0a8b1d360","Handler":null,"Name":"","Endpoint":""} Apr 21 04:39:18.824724 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.824694 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 04:39:18.824724 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.824729 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 04:39:18.892392 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:18.892353 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:39:18.892600 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:18.892499 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530" Apr 21 04:39:19.039560 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.039522 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t58lb" event={"ID":"feb386b7-4b7c-4ba5-9c98-fd202d27be4d","Type":"ContainerStarted","Data":"1a76a48a3399ff91c0854bb446602c658b6d2b9d99a44617d1e07c1973fe738e"} Apr 21 04:39:19.041395 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.041351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" event={"ID":"65169960-4a16-4850-a073-5d9addbb46e9","Type":"ContainerStarted","Data":"743d381e19325c7107195871bc59524e496d75092275cb3f1195ae7290de6100"} Apr 21 04:39:19.044566 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.044537 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log" Apr 21 04:39:19.045063 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.044993 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"81d2869f17edc79c50ad4cba86bcfbe7e4fe8777703c6ae54c4e799d4c8fdb42"} Apr 21 04:39:19.045063 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.045026 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"74d342a83219b6f726634eceb43d6805687adf4de0780817159fdc15738e55a6"} Apr 21 04:39:19.069455 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.069409 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t58lb" podStartSLOduration=5.083939844 podStartE2EDuration="22.069390641s" podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.47077684 +0000 UTC m=+3.097453511" lastFinishedPulling="2026-04-21 04:39:17.456227642 +0000 UTC m=+20.082904308" observedRunningTime="2026-04-21 04:39:19.069335583 +0000 UTC m=+21.696012262" watchObservedRunningTime="2026-04-21 04:39:19.069390641 +0000 UTC m=+21.696067645" Apr 21 04:39:19.805744 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.805566 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:39:19.806214 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.806198 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:39:19.891993 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.891960 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb" Apr 21 04:39:19.891993 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:19.891996 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:39:19.892237 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:19.892094 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14" Apr 21 04:39:19.892237 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:19.892191 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79" Apr 21 04:39:20.049523 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:20.049487 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" event={"ID":"65169960-4a16-4850-a073-5d9addbb46e9","Type":"ContainerStarted","Data":"91a5577f07a583ba5b62d8f49a2fc0e3b26176bcf5e9ed5c8b1875cc591dba08"} Apr 21 04:39:20.050041 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:20.050022 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:39:20.050445 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:20.050430 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-djdfl" Apr 21 04:39:20.066210 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:20.066164 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dcx44" podStartSLOduration=2.889758161 podStartE2EDuration="22.066150317s" podCreationTimestamp="2026-04-21 04:38:58 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.468650684 +0000 UTC m=+3.095327341" lastFinishedPulling="2026-04-21 04:39:19.645042825 +0000 UTC m=+22.271719497" observedRunningTime="2026-04-21 04:39:20.065477525 +0000 UTC m=+22.692154203" watchObservedRunningTime="2026-04-21 04:39:20.066150317 +0000 UTC m=+22.692826994" Apr 21 04:39:20.892635 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:20.892602 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:39:20.892837 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:20.892720 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530" Apr 21 04:39:21.054709 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:21.054678 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log" Apr 21 04:39:21.055255 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:21.055065 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"d6642408f7bd3bcfd684f091a8904d557338554196b0697d2dfc08f877985128"} Apr 21 04:39:21.746767 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:21.746725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb" Apr 21 04:39:21.746959 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:21.746893 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:39:21.747030 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:21.746971 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret podName:533e48e5-7652-4081-aa24-2f0eaed21d14 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:37.746950749 +0000 UTC m=+40.373627417 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret") pod "global-pull-secret-syncer-6rnkb" (UID: "533e48e5-7652-4081-aa24-2f0eaed21d14") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:39:21.892299 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:21.892262 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:21.892299 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:21.892280 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:21.892542 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:21.892449 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:21.892592 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:21.892557 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:22.892724 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:22.892695 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:22.893212 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:22.892805 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:23.061758 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:23.061608 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log"
Apr 21 04:39:23.062668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:23.062097 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"179b549f4224ffb41d9d68e6c84cbb755d51e2af49b212c7c4f65cadb8ad78eb"}
Apr 21 04:39:23.062668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:23.062356 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:39:23.062668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:23.062560 2579 scope.go:117] "RemoveContainer" containerID="f08b201c07c17e3fb89d8d37f0bec0187f99a69a359bea0d027f68ea23ce6828"
Apr 21 04:39:23.078515 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:23.078492 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:39:23.891872 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:23.891850 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:23.892002 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:23.891853 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:23.892002 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:23.891949 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:23.892111 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:23.892033 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:24.068321 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.068295 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log"
Apr 21 04:39:24.068708 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.068613 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" event={"ID":"6b281d59-c062-4407-95da-057a82e47cba","Type":"ContainerStarted","Data":"bc1177bdd7cf5c52deb2c6e1d468bdd22e12a35178fb2be61b9db43fad04312a"}
Apr 21 04:39:24.068935 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.068914 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:39:24.069024 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.068950 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:39:24.070139 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.070107 2579 generic.go:358] "Generic (PLEG): container finished" podID="506082c4-3364-48e7-a27f-927f2729dde4" containerID="2ec99bbbad00a1a7d29254100b393e3dba7427666f59c95aa628fe7d321a8c54" exitCode=0
Apr 21 04:39:24.070226 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.070142 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerDied","Data":"2ec99bbbad00a1a7d29254100b393e3dba7427666f59c95aa628fe7d321a8c54"}
Apr 21 04:39:24.084593 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.084570 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj"
Apr 21 04:39:24.100740 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.100701 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" podStartSLOduration=9.975649084 podStartE2EDuration="27.100689029s" podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.46657465 +0000 UTC m=+3.093251305" lastFinishedPulling="2026-04-21 04:39:17.591614587 +0000 UTC m=+20.218291250" observedRunningTime="2026-04-21 04:39:24.098873188 +0000 UTC m=+26.725549866" watchObservedRunningTime="2026-04-21 04:39:24.100689029 +0000 UTC m=+26.727365707"
Apr 21 04:39:24.891936 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.891912 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:24.892053 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:24.892024 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:24.907558 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.907325 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6rnkb"]
Apr 21 04:39:24.907700 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.907674 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:24.907799 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:24.907780 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:24.909986 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.909967 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-chcg6"]
Apr 21 04:39:24.910680 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.910661 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jxwc7"]
Apr 21 04:39:24.910784 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:24.910776 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:24.910907 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:24.910884 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:25.073521 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:25.073493 2579 generic.go:358] "Generic (PLEG): container finished" podID="506082c4-3364-48e7-a27f-927f2729dde4" containerID="66a1eb00c883967026737a6c223c115c8e548f91f9df37e64b61182370dfd5f4" exitCode=0
Apr 21 04:39:25.073902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:25.073563 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerDied","Data":"66a1eb00c883967026737a6c223c115c8e548f91f9df37e64b61182370dfd5f4"}
Apr 21 04:39:25.073902 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:25.073591 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:25.073902 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:25.073675 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:26.077675 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:26.077643 2579 generic.go:358] "Generic (PLEG): container finished" podID="506082c4-3364-48e7-a27f-927f2729dde4" containerID="3ce3b061ca22a751bdc8da6025f02ae6aa75ca260b16fad230d0f25813f7e49e" exitCode=0
Apr 21 04:39:26.078019 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:26.077721 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerDied","Data":"3ce3b061ca22a751bdc8da6025f02ae6aa75ca260b16fad230d0f25813f7e49e"}
Apr 21 04:39:26.892477 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:26.892440 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:26.892477 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:26.892470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:26.892697 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:26.892583 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:26.892697 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:26.892579 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:26.892697 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:26.892645 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:26.892840 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:26.892777 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:28.892189 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:28.892152 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:28.893025 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:28.892152 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:28.893025 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:28.892285 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6rnkb" podUID="533e48e5-7652-4081-aa24-2f0eaed21d14"
Apr 21 04:39:28.893025 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:28.892307 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:28.893025 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:28.892418 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-chcg6" podUID="ca0a3fc9-06ad-4561-9392-21daefb76530"
Apr 21 04:39:28.893025 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:28.892503 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jxwc7" podUID="e5103329-ae63-4574-9dcc-140804f95f79"
Apr 21 04:39:30.756752 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.756720 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-122.ec2.internal" event="NodeReady"
Apr 21 04:39:30.757291 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.756904 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 04:39:30.801228 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.801191 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vk48w"]
Apr 21 04:39:30.826713 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.826673 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-j8jtk"]
Apr 21 04:39:30.826947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.826922 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:30.829951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.829851 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 04:39:30.829951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.829866 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 04:39:30.829951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.829931 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f9n52\""
Apr 21 04:39:30.841115 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.841093 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vk48w"]
Apr 21 04:39:30.841115 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.841119 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-j8jtk"]
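The repeated `pod_workers.go:1301] "Error syncing pod, skipping"` records above are easier to triage when collapsed into per-pod counts. A minimal sketch of doing that over lines in this journal format (the regex targets the exact record shape shown above; reading from a file or `journalctl` pipe is left to the caller and is an assumption):

```python
import re
from collections import Counter

# Matches kubelet sync-error records as they appear in this log:
#   ... pod_workers.go:1301] "Error syncing pod, skipping" err="..." pod="ns/name" podUID="..."
SYNC_ERR = re.compile(r'pod_workers\.go:\d+\] "Error syncing pod, skipping" .*? pod="([^"]+)"')

def count_sync_errors(lines):
    """Return a Counter of 'Error syncing pod' occurrences keyed by namespace/pod."""
    counts = Counter()
    for line in lines:
        m = SYNC_ERR.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts
```

Running it over this section would show the same three pods (`global-pull-secret-syncer-6rnkb`, `network-metrics-daemon-jxwc7`, `network-check-target-chcg6`) failing in lockstep on the shared CNI-not-ready condition, which points at the network operator rather than the individual workloads.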
Apr 21 04:39:30.841266 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.841229 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:30.844086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.844063 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 04:39:30.844452 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.844398 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 04:39:30.844452 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.844440 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 04:39:30.844595 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.844440 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ljx4n\""
Apr 21 04:39:30.892469 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.892430 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:30.892659 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.892440 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:30.892659 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.892445 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb"
Apr 21 04:39:30.895339 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.895317 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 04:39:30.895474 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.895345 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgxqw\""
Apr 21 04:39:30.895474 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.895318 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 04:39:30.895678 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.895658 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-979g8\""
Apr 21 04:39:30.895756 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.895690 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 04:39:30.895756 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.895696 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 04:39:30.911021 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.910980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/56593aaa-f779-4c98-94da-5b75ed6e9124-tmp-dir\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:30.911154 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.911129 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:30.911216 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.911186 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj4r5\" (UniqueName: \"kubernetes.io/projected/56593aaa-f779-4c98-94da-5b75ed6e9124-kube-api-access-tj4r5\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:30.911268 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:30.911240 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56593aaa-f779-4c98-94da-5b75ed6e9124-config-volume\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.011985 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.011892 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/56593aaa-f779-4c98-94da-5b75ed6e9124-tmp-dir\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.011985 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.011967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhcfj\" (UniqueName: \"kubernetes.io/projected/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-kube-api-access-jhcfj\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:31.012226 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.011996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:31.012226 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.012154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.012226 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.012202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj4r5\" (UniqueName: \"kubernetes.io/projected/56593aaa-f779-4c98-94da-5b75ed6e9124-kube-api-access-tj4r5\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.012348 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.012236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56593aaa-f779-4c98-94da-5b75ed6e9124-config-volume\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.012348 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.012274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/56593aaa-f779-4c98-94da-5b75ed6e9124-tmp-dir\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.012348 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.012323 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:31.012478 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.012429 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:31.512407541 +0000 UTC m=+34.139084198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found
Apr 21 04:39:31.012735 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.012716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56593aaa-f779-4c98-94da-5b75ed6e9124-config-volume\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.025048 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.025017 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj4r5\" (UniqueName: \"kubernetes.io/projected/56593aaa-f779-4c98-94da-5b75ed6e9124-kube-api-access-tj4r5\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.113581 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.113547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:31.113752 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.113674 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhcfj\" (UniqueName: \"kubernetes.io/projected/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-kube-api-access-jhcfj\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:31.113752 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.113708 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:31.113841 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.113788 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:31.613768728 +0000 UTC m=+34.240445389 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found
Apr 21 04:39:31.134921 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.134888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhcfj\" (UniqueName: \"kubernetes.io/projected/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-kube-api-access-jhcfj\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:31.518055 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.518008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:31.518317 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.518099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:39:31.518317 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.518179 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:31.518317 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.518273 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 04:39:31.518317 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.518280 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:32.518256382 +0000 UTC m=+35.144933053 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found
Apr 21 04:39:31.518317 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.518320 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:03.518307104 +0000 UTC m=+66.144983759 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : secret "metrics-daemon-secret" not found
Apr 21 04:39:31.619161 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.619114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:31.619161 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.619172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:31.619444 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.619282 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:31.619444 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:31.619385 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:32.619346255 +0000 UTC m=+35.246022911 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found
Apr 21 04:39:31.621650 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.621628 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbf4b\" (UniqueName: \"kubernetes.io/projected/ca0a3fc9-06ad-4561-9392-21daefb76530-kube-api-access-dbf4b\") pod \"network-check-target-chcg6\" (UID: \"ca0a3fc9-06ad-4561-9392-21daefb76530\") " pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:31.813525 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:31.813485 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-chcg6"
Apr 21 04:39:32.065312 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:32.065119 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-chcg6"]
Apr 21 04:39:32.069279 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:32.069246 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca0a3fc9_06ad_4561_9392_21daefb76530.slice/crio-d170f0d77af48cf508e5e7039f716f670d7fdc50fbae1e7f04a4fef5ba5077b2 WatchSource:0}: Error finding container d170f0d77af48cf508e5e7039f716f670d7fdc50fbae1e7f04a4fef5ba5077b2: Status 404 returned error can't find the container with id d170f0d77af48cf508e5e7039f716f670d7fdc50fbae1e7f04a4fef5ba5077b2
Apr 21 04:39:32.090717 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:32.090691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-chcg6" event={"ID":"ca0a3fc9-06ad-4561-9392-21daefb76530","Type":"ContainerStarted","Data":"d170f0d77af48cf508e5e7039f716f670d7fdc50fbae1e7f04a4fef5ba5077b2"}
Apr 21 04:39:32.526948 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:32.526863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:32.527091 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:32.527016 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:32.527130 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:32.527093 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:34.527076848 +0000 UTC m=+37.153753503 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found
Apr 21 04:39:32.627593 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:32.627558 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:39:32.627753 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:32.627696 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:32.627799 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:32.627777 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:34.627755955 +0000 UTC m=+37.254432612 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found
Apr 21 04:39:33.096210 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:33.096179 2579 generic.go:358] "Generic (PLEG): container finished" podID="506082c4-3364-48e7-a27f-927f2729dde4" containerID="10898bf172bd82a0d1db874d19f2d947982ed3c65f434747bd28e2646f488208" exitCode=0
Apr 21 04:39:33.096210 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:33.096216 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerDied","Data":"10898bf172bd82a0d1db874d19f2d947982ed3c65f434747bd28e2646f488208"}
Apr 21 04:39:34.101498 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:34.101466 2579 generic.go:358] "Generic (PLEG): container finished" podID="506082c4-3364-48e7-a27f-927f2729dde4" containerID="58fc805240db135bb1064480b0931d6802ebc15212f041211f768936ee6acae1" exitCode=0
Apr 21 04:39:34.101914 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:34.101530 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerDied","Data":"58fc805240db135bb1064480b0931d6802ebc15212f041211f768936ee6acae1"}
Apr 21 04:39:34.545126 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:34.544929 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w"
Apr 21 04:39:34.545307 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:34.545089 2579 secret.go:189] Couldn't get secret
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:34.545307 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:34.545217 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:38.545196242 +0000 UTC m=+41.171872898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found Apr 21 04:39:34.645657 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:34.645613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk" Apr 21 04:39:34.645835 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:34.645802 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:34.645926 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:34.645903 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:38.645878871 +0000 UTC m=+41.272555528 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found Apr 21 04:39:35.108201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:35.108165 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" event={"ID":"506082c4-3364-48e7-a27f-927f2729dde4","Type":"ContainerStarted","Data":"dd038d1a909f80bfa97a40fab0b6ab8f94a6e0392df126f3acf27c0e6e551a6a"} Apr 21 04:39:35.129839 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:35.129786 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hvz8m" podStartSLOduration=6.661229063 podStartE2EDuration="38.129771724s" podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:39:00.469485722 +0000 UTC m=+3.096162382" lastFinishedPulling="2026-04-21 04:39:31.93802837 +0000 UTC m=+34.564705043" observedRunningTime="2026-04-21 04:39:35.128444292 +0000 UTC m=+37.755120970" watchObservedRunningTime="2026-04-21 04:39:35.129771724 +0000 UTC m=+37.756448401" Apr 21 04:39:36.111440 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:36.111401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-chcg6" event={"ID":"ca0a3fc9-06ad-4561-9392-21daefb76530","Type":"ContainerStarted","Data":"7fb11d810bfc697b318f21f7f659bc334f8865e198c2624389378918048fbadc"} Apr 21 04:39:36.111922 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:36.111557 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:39:36.127334 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:36.127289 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-chcg6" podStartSLOduration=36.056785703 podStartE2EDuration="39.127276571s" podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:39:32.071332524 +0000 UTC m=+34.698009183" lastFinishedPulling="2026-04-21 04:39:35.141823384 +0000 UTC m=+37.768500051" observedRunningTime="2026-04-21 04:39:36.12597114 +0000 UTC m=+38.752647818" watchObservedRunningTime="2026-04-21 04:39:36.127276571 +0000 UTC m=+38.753953249" Apr 21 04:39:37.768134 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:37.768083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb" Apr 21 04:39:37.772294 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:37.772260 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/533e48e5-7652-4081-aa24-2f0eaed21d14-original-pull-secret\") pod \"global-pull-secret-syncer-6rnkb\" (UID: \"533e48e5-7652-4081-aa24-2f0eaed21d14\") " pod="kube-system/global-pull-secret-syncer-6rnkb" Apr 21 04:39:37.818662 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:37.818622 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6rnkb" Apr 21 04:39:37.940948 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:37.940915 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6rnkb"] Apr 21 04:39:37.944349 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:39:37.944311 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533e48e5_7652_4081_aa24_2f0eaed21d14.slice/crio-4cece1611db828ca7a6d8abfe7736f9a20595d8f08ea44db78bb9829130e0b37 WatchSource:0}: Error finding container 4cece1611db828ca7a6d8abfe7736f9a20595d8f08ea44db78bb9829130e0b37: Status 404 returned error can't find the container with id 4cece1611db828ca7a6d8abfe7736f9a20595d8f08ea44db78bb9829130e0b37 Apr 21 04:39:38.116225 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:38.116186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6rnkb" event={"ID":"533e48e5-7652-4081-aa24-2f0eaed21d14","Type":"ContainerStarted","Data":"4cece1611db828ca7a6d8abfe7736f9a20595d8f08ea44db78bb9829130e0b37"} Apr 21 04:39:38.574107 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:38.574058 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w" Apr 21 04:39:38.574275 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:38.574203 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:38.574325 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:38.574279 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls 
podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:46.574258686 +0000 UTC m=+49.200935341 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found Apr 21 04:39:38.675271 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:38.675236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk" Apr 21 04:39:38.675458 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:38.675410 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:38.675506 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:38.675472 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:46.675457607 +0000 UTC m=+49.302134262 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found Apr 21 04:39:43.126529 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:43.126494 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6rnkb" event={"ID":"533e48e5-7652-4081-aa24-2f0eaed21d14","Type":"ContainerStarted","Data":"89f8eb3e894fdf993e1abe97e00826a00c45db01aa8bac6a23be96a964e7b4a3"} Apr 21 04:39:43.141437 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:43.141388 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6rnkb" podStartSLOduration=33.926467016 podStartE2EDuration="38.14135039s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:37.946088779 +0000 UTC m=+40.572765438" lastFinishedPulling="2026-04-21 04:39:42.160972153 +0000 UTC m=+44.787648812" observedRunningTime="2026-04-21 04:39:43.140771535 +0000 UTC m=+45.767448213" watchObservedRunningTime="2026-04-21 04:39:43.14135039 +0000 UTC m=+45.768027062" Apr 21 04:39:46.634635 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:46.634597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w" Apr 21 04:39:46.635010 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:46.634757 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:46.635010 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:46.634827 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:02.634807547 +0000 UTC m=+65.261484203 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found Apr 21 04:39:46.735491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:46.735455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk" Apr 21 04:39:46.735636 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:46.735561 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:46.735636 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:39:46.735634 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:02.735619618 +0000 UTC m=+65.362296273 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found Apr 21 04:39:56.088438 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:39:56.088408 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dv4qj" Apr 21 04:40:02.639194 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:02.639156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w" Apr 21 04:40:02.639588 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:02.639267 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:40:02.639588 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:02.639318 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:34.639304943 +0000 UTC m=+97.265981599 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found Apr 21 04:40:02.739806 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:02.739769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk" Apr 21 04:40:02.739979 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:02.739954 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:40:02.740056 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:02.740040 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:34.740016943 +0000 UTC m=+97.366693619 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found Apr 21 04:40:03.546218 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:03.546176 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7" Apr 21 04:40:03.546474 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:03.546335 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 04:40:03.546474 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:03.546436 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs podName:e5103329-ae63-4574-9dcc-140804f95f79 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:07.546418855 +0000 UTC m=+130.173095531 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs") pod "network-metrics-daemon-jxwc7" (UID: "e5103329-ae63-4574-9dcc-140804f95f79") : secret "metrics-daemon-secret" not found Apr 21 04:40:07.115862 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:07.115834 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-chcg6" Apr 21 04:40:34.654926 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:34.654863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w" Apr 21 04:40:34.655351 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:34.655025 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:40:34.655351 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:34.655095 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls podName:56593aaa-f779-4c98-94da-5b75ed6e9124 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:38.655075653 +0000 UTC m=+161.281752311 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls") pod "dns-default-vk48w" (UID: "56593aaa-f779-4c98-94da-5b75ed6e9124") : secret "dns-default-metrics-tls" not found Apr 21 04:40:34.755237 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:34.755141 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk" Apr 21 04:40:34.755394 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:34.755290 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:40:34.755394 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:34.755378 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert podName:f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:38.755345338 +0000 UTC m=+161.382021999 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert") pod "ingress-canary-j8jtk" (UID: "f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2") : secret "canary-serving-cert" not found Apr 21 04:40:38.175530 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.175495 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6"] Apr 21 04:40:38.179567 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.179549 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" Apr 21 04:40:38.182771 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.182750 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 04:40:38.183894 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.183874 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:40:38.183894 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.183888 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-txb4w\"" Apr 21 04:40:38.184705 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.184686 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6"] Apr 21 04:40:38.279594 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.279556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwp85\" (UniqueName: \"kubernetes.io/projected/3848cc69-2658-432a-9bc4-45e27bb60167-kube-api-access-pwp85\") pod \"volume-data-source-validator-7c6cbb6c87-pv9h6\" (UID: \"3848cc69-2658-432a-9bc4-45e27bb60167\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" Apr 21 04:40:38.380207 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.380146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwp85\" (UniqueName: \"kubernetes.io/projected/3848cc69-2658-432a-9bc4-45e27bb60167-kube-api-access-pwp85\") pod \"volume-data-source-validator-7c6cbb6c87-pv9h6\" (UID: \"3848cc69-2658-432a-9bc4-45e27bb60167\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" Apr 21 04:40:38.388778 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.388740 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwp85\" (UniqueName: \"kubernetes.io/projected/3848cc69-2658-432a-9bc4-45e27bb60167-kube-api-access-pwp85\") pod \"volume-data-source-validator-7c6cbb6c87-pv9h6\" (UID: \"3848cc69-2658-432a-9bc4-45e27bb60167\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" Apr 21 04:40:38.489684 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.489577 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" Apr 21 04:40:38.603415 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:38.603383 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6"] Apr 21 04:40:38.606760 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:40:38.606725 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3848cc69_2658_432a_9bc4_45e27bb60167.slice/crio-873e6c52354f5a8fc83c452528f3583cea07147ad4b9e3d77aec4d9a202d2ebd WatchSource:0}: Error finding container 873e6c52354f5a8fc83c452528f3583cea07147ad4b9e3d77aec4d9a202d2ebd: Status 404 returned error can't find the container with id 873e6c52354f5a8fc83c452528f3583cea07147ad4b9e3d77aec4d9a202d2ebd Apr 21 04:40:39.237644 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:39.237610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" event={"ID":"3848cc69-2658-432a-9bc4-45e27bb60167","Type":"ContainerStarted","Data":"873e6c52354f5a8fc83c452528f3583cea07147ad4b9e3d77aec4d9a202d2ebd"} Apr 21 04:40:40.240720 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.240629 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" event={"ID":"3848cc69-2658-432a-9bc4-45e27bb60167","Type":"ContainerStarted","Data":"7b1315c66c7bd90ca51633f1648780875743d962a3fe8d85afe17eef626d61a2"} Apr 21 04:40:40.255865 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.255814 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pv9h6" podStartSLOduration=1.044867193 podStartE2EDuration="2.255798828s" podCreationTimestamp="2026-04-21 04:40:38 +0000 UTC" firstStartedPulling="2026-04-21 04:40:38.608448537 +0000 UTC m=+101.235125193" lastFinishedPulling="2026-04-21 04:40:39.819380152 +0000 UTC m=+102.446056828" observedRunningTime="2026-04-21 04:40:40.254750981 +0000 UTC m=+102.881427656" watchObservedRunningTime="2026-04-21 04:40:40.255798828 +0000 UTC m=+102.882475505" Apr 21 04:40:40.711712 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.711680 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56fc55f57c-qwxvv"] Apr 21 04:40:40.714674 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.714656 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.717169 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.717150 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 04:40:40.717310 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.717298 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlhzm\"" Apr 21 04:40:40.717356 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.717298 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 04:40:40.717528 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.717512 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 04:40:40.722303 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.722286 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 04:40:40.727329 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.727307 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56fc55f57c-qwxvv"] Apr 21 04:40:40.797740 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-ca-trust-extracted\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.797940 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-installation-pull-secrets\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.797940 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.797940 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-trusted-ca\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.797940 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-certificates\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.797940 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797927 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-bound-sa-token\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: 
\"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.798121 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797976 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-image-registry-private-configuration\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.798121 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.797993 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-kube-api-access-fqt2r\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.898928 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.898893 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.898939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-trusted-ca\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.898968 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-certificates\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-bound-sa-token\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899086 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:40.899029 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:40:40.899086 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:40.899056 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56fc55f57c-qwxvv: secret "image-registry-tls" not found Apr 21 04:40:40.899086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899058 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-image-registry-private-configuration\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899292 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:40.899118 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls podName:e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e 
nodeName:}" failed. No retries permitted until 2026-04-21 04:40:41.399096129 +0000 UTC m=+104.025772785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls") pod "image-registry-56fc55f57c-qwxvv" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e") : secret "image-registry-tls" not found Apr 21 04:40:40.899292 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-kube-api-access-fqt2r\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899292 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-ca-trust-extracted\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899472 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899297 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-installation-pull-secrets\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899726 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-ca-trust-extracted\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899843 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899756 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-certificates\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.899947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.899930 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-trusted-ca\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.901405 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.901384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-image-registry-private-configuration\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.901586 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.901568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-installation-pull-secrets\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 
04:40:40.907718 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.907695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-bound-sa-token\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:40.907872 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:40.907855 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-kube-api-access-fqt2r\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:41.402678 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:41.402628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:41.403062 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:41.402777 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:40:41.403062 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:41.402797 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56fc55f57c-qwxvv: secret "image-registry-tls" not found Apr 21 04:40:41.403062 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:41.402877 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls 
podName:e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:42.402860013 +0000 UTC m=+105.029536674 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls") pod "image-registry-56fc55f57c-qwxvv" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e") : secret "image-registry-tls" not found Apr 21 04:40:42.409682 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:42.409628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:42.410079 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:42.409757 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:40:42.410079 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:42.409769 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56fc55f57c-qwxvv: secret "image-registry-tls" not found Apr 21 04:40:42.410079 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:42.409827 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls podName:e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:44.409814439 +0000 UTC m=+107.036491095 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls") pod "image-registry-56fc55f57c-qwxvv" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e") : secret "image-registry-tls" not found Apr 21 04:40:43.183479 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.183446 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm"] Apr 21 04:40:43.187375 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.187349 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.189861 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.189837 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:40:43.189861 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.189857 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-t8kxm\"" Apr 21 04:40:43.190053 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.189862 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 04:40:43.190952 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.190931 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 04:40:43.191128 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.190949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 04:40:43.195928 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.195906 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm"] Apr 21 04:40:43.318708 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.318666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ec7bfa-4b0d-470f-912f-87600811562b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.318883 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.318721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8mh\" (UniqueName: \"kubernetes.io/projected/30ec7bfa-4b0d-470f-912f-87600811562b-kube-api-access-bf8mh\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.318883 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.318841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ec7bfa-4b0d-470f-912f-87600811562b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.419932 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.419890 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bf8mh\" (UniqueName: \"kubernetes.io/projected/30ec7bfa-4b0d-470f-912f-87600811562b-kube-api-access-bf8mh\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.420308 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.419991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ec7bfa-4b0d-470f-912f-87600811562b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.420308 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.420043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ec7bfa-4b0d-470f-912f-87600811562b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.420642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.420619 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ec7bfa-4b0d-470f-912f-87600811562b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.422155 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.422137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ec7bfa-4b0d-470f-912f-87600811562b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.427524 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.427504 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8mh\" (UniqueName: \"kubernetes.io/projected/30ec7bfa-4b0d-470f-912f-87600811562b-kube-api-access-bf8mh\") pod \"kube-storage-version-migrator-operator-6769c5d45-s5bdm\" (UID: \"30ec7bfa-4b0d-470f-912f-87600811562b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.497937 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.497836 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" Apr 21 04:40:43.611462 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.611411 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm"] Apr 21 04:40:43.615278 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:40:43.615239 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ec7bfa_4b0d_470f_912f_87600811562b.slice/crio-dda60c761617f2610126ca1899fcd978a4be17baea55019e9a4a320fd0205372 WatchSource:0}: Error finding container dda60c761617f2610126ca1899fcd978a4be17baea55019e9a4a320fd0205372: Status 404 returned error can't find the container with id dda60c761617f2610126ca1899fcd978a4be17baea55019e9a4a320fd0205372 Apr 21 04:40:43.983597 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:43.983567 
2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-db4v4_b54dd57a-4c1d-4f99-a559-3e4be3f7266f/dns-node-resolver/0.log" Apr 21 04:40:44.250063 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:44.249970 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" event={"ID":"30ec7bfa-4b0d-470f-912f-87600811562b","Type":"ContainerStarted","Data":"dda60c761617f2610126ca1899fcd978a4be17baea55019e9a4a320fd0205372"} Apr 21 04:40:44.426505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:44.426452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:40:44.426937 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:44.426624 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:40:44.426937 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:44.426645 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56fc55f57c-qwxvv: secret "image-registry-tls" not found Apr 21 04:40:44.426937 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:44.426700 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls podName:e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:48.426684168 +0000 UTC m=+111.053360824 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls") pod "image-registry-56fc55f57c-qwxvv" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e") : secret "image-registry-tls" not found Apr 21 04:40:44.983749 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:44.983719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5x8r5_43feefbe-ff70-4e7a-8ad0-1791e41e4c6c/node-ca/0.log" Apr 21 04:40:45.173037 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.172995 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm"] Apr 21 04:40:45.175937 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.175915 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.178535 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.178508 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 04:40:45.178668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.178531 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 04:40:45.179753 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.179732 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 04:40:45.179853 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.179733 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-w4fnn\"" Apr 21 04:40:45.179853 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.179733 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:40:45.182917 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.182884 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm"] Apr 21 04:40:45.231736 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.231702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c006d753-e048-4aef-a851-7a8ec3111def-config\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.231879 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.231768 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c006d753-e048-4aef-a851-7a8ec3111def-serving-cert\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.231879 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.231819 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4k4r\" (UniqueName: \"kubernetes.io/projected/c006d753-e048-4aef-a851-7a8ec3111def-kube-api-access-k4k4r\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.333025 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.332995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c006d753-e048-4aef-a851-7a8ec3111def-serving-cert\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: 
\"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.333185 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.333049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4k4r\" (UniqueName: \"kubernetes.io/projected/c006d753-e048-4aef-a851-7a8ec3111def-kube-api-access-k4k4r\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.333185 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.333094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c006d753-e048-4aef-a851-7a8ec3111def-config\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.333638 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.333608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c006d753-e048-4aef-a851-7a8ec3111def-config\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.335376 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.335348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c006d753-e048-4aef-a851-7a8ec3111def-serving-cert\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.340563 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.340539 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k4k4r\" (UniqueName: \"kubernetes.io/projected/c006d753-e048-4aef-a851-7a8ec3111def-kube-api-access-k4k4r\") pod \"service-ca-operator-d6fc45fc5-phgkm\" (UID: \"c006d753-e048-4aef-a851-7a8ec3111def\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.486017 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.485976 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" Apr 21 04:40:45.953033 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:45.953006 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm"] Apr 21 04:40:45.956295 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:40:45.956270 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc006d753_e048_4aef_a851_7a8ec3111def.slice/crio-9810a92fc86ac08b16fc11fd034489696b8d69094e696ae9a254a5e5403f31fd WatchSource:0}: Error finding container 9810a92fc86ac08b16fc11fd034489696b8d69094e696ae9a254a5e5403f31fd: Status 404 returned error can't find the container with id 9810a92fc86ac08b16fc11fd034489696b8d69094e696ae9a254a5e5403f31fd Apr 21 04:40:46.255626 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:46.255536 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" event={"ID":"30ec7bfa-4b0d-470f-912f-87600811562b","Type":"ContainerStarted","Data":"43c5ba5665d89e73cd69eb89a746e145b5e97699e2fd55a15e97298991e24ba1"} Apr 21 04:40:46.256541 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:46.256515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" 
event={"ID":"c006d753-e048-4aef-a851-7a8ec3111def","Type":"ContainerStarted","Data":"9810a92fc86ac08b16fc11fd034489696b8d69094e696ae9a254a5e5403f31fd"} Apr 21 04:40:46.270232 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:46.270170 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" podStartSLOduration=1.000561148 podStartE2EDuration="3.270153324s" podCreationTimestamp="2026-04-21 04:40:43 +0000 UTC" firstStartedPulling="2026-04-21 04:40:43.617031729 +0000 UTC m=+106.243708386" lastFinishedPulling="2026-04-21 04:40:45.886623904 +0000 UTC m=+108.513300562" observedRunningTime="2026-04-21 04:40:46.269707778 +0000 UTC m=+108.896384453" watchObservedRunningTime="2026-04-21 04:40:46.270153324 +0000 UTC m=+108.896830005" Apr 21 04:40:48.262132 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:48.262093 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" event={"ID":"c006d753-e048-4aef-a851-7a8ec3111def","Type":"ContainerStarted","Data":"6691ac454d40d93a3ac0e68ea4f2a86938470547f514182873bbbc04c0e790cc"} Apr 21 04:40:48.277455 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:48.277411 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" podStartSLOduration=1.567572985 podStartE2EDuration="3.277396017s" podCreationTimestamp="2026-04-21 04:40:45 +0000 UTC" firstStartedPulling="2026-04-21 04:40:45.958129654 +0000 UTC m=+108.584806310" lastFinishedPulling="2026-04-21 04:40:47.667952683 +0000 UTC m=+110.294629342" observedRunningTime="2026-04-21 04:40:48.276968865 +0000 UTC m=+110.903645543" watchObservedRunningTime="2026-04-21 04:40:48.277396017 +0000 UTC m=+110.904072688" Apr 21 04:40:48.460884 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:48.460842 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv"
Apr 21 04:40:48.461039 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:48.460961 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:40:48.461039 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:48.460973 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56fc55f57c-qwxvv: secret "image-registry-tls" not found
Apr 21 04:40:48.461039 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:48.461022 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls podName:e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:56.461008944 +0000 UTC m=+119.087685600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls") pod "image-registry-56fc55f57c-qwxvv" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e") : secret "image-registry-tls" not found
Apr 21 04:40:51.057312 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.057279 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-f54r6"]
Apr 21 04:40:51.060210 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.060193 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.062669 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.062645 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 21 04:40:51.062773 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.062728 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 21 04:40:51.063816 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.063797 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 21 04:40:51.063864 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.063841 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hhbhd\""
Apr 21 04:40:51.063864 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.063857 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 21 04:40:51.069412 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.069389 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-f54r6"]
Apr 21 04:40:51.182818 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.182773 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-signing-key\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.183003 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.182842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729ph\" (UniqueName: \"kubernetes.io/projected/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-kube-api-access-729ph\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.183003 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.182879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-signing-cabundle\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.284215 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.284159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-signing-key\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.284443 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.284247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-729ph\" (UniqueName: \"kubernetes.io/projected/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-kube-api-access-729ph\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.284443 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.284288 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-signing-cabundle\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.285318 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.285289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-signing-cabundle\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.286588 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.286570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-signing-key\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.295985 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.295958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-729ph\" (UniqueName: \"kubernetes.io/projected/ce7c6ed6-460f-4f63-97ea-c335cb5b11f4-kube-api-access-729ph\") pod \"service-ca-865cb79987-f54r6\" (UID: \"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4\") " pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.369357 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.369255 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-f54r6"
Apr 21 04:40:51.485477 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:51.485429 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-f54r6"]
Apr 21 04:40:51.488203 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:40:51.488159 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce7c6ed6_460f_4f63_97ea_c335cb5b11f4.slice/crio-005ed0e8967fd34150311bd80ffa531009a056e88322bb71815f469384b184d4 WatchSource:0}: Error finding container 005ed0e8967fd34150311bd80ffa531009a056e88322bb71815f469384b184d4: Status 404 returned error can't find the container with id 005ed0e8967fd34150311bd80ffa531009a056e88322bb71815f469384b184d4
Apr 21 04:40:52.272892 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:52.272850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-f54r6" event={"ID":"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4","Type":"ContainerStarted","Data":"fa2086b3cb210fa7b657aa6ffff0f9c0589b75057fd1bc9e37439c135644cec1"}
Apr 21 04:40:52.272892 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:52.272889 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-f54r6" event={"ID":"ce7c6ed6-460f-4f63-97ea-c335cb5b11f4","Type":"ContainerStarted","Data":"005ed0e8967fd34150311bd80ffa531009a056e88322bb71815f469384b184d4"}
Apr 21 04:40:52.288329 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:52.288260 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-f54r6" podStartSLOduration=1.288242038 podStartE2EDuration="1.288242038s" podCreationTimestamp="2026-04-21 04:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:40:52.287646824 +0000 UTC m=+114.914323504" watchObservedRunningTime="2026-04-21 04:40:52.288242038 +0000 UTC m=+114.914918717"
Apr 21 04:40:56.523546 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:40:56.523500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv"
Apr 21 04:40:56.524033 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:56.523645 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:40:56.524033 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:56.523666 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56fc55f57c-qwxvv: secret "image-registry-tls" not found
Apr 21 04:40:56.524033 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:40:56.523725 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls podName:e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e nodeName:}" failed. No retries permitted until 2026-04-21 04:41:12.523708421 +0000 UTC m=+135.150385089 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls") pod "image-registry-56fc55f57c-qwxvv" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e") : secret "image-registry-tls" not found
Apr 21 04:41:07.605626 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:07.605572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:41:07.607927 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:07.607901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5103329-ae63-4574-9dcc-140804f95f79-metrics-certs\") pod \"network-metrics-daemon-jxwc7\" (UID: \"e5103329-ae63-4574-9dcc-140804f95f79\") " pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:41:07.806320 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:07.806285 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-979g8\""
Apr 21 04:41:07.813690 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:07.813653 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jxwc7"
Apr 21 04:41:07.932058 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:07.932027 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jxwc7"]
Apr 21 04:41:07.935217 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:07.935182 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5103329_ae63_4574_9dcc_140804f95f79.slice/crio-9fa8177208402f25211a87ad0da02363404d3d5a28a13edf828832275a9f1f3a WatchSource:0}: Error finding container 9fa8177208402f25211a87ad0da02363404d3d5a28a13edf828832275a9f1f3a: Status 404 returned error can't find the container with id 9fa8177208402f25211a87ad0da02363404d3d5a28a13edf828832275a9f1f3a
Apr 21 04:41:08.313145 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:08.313104 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jxwc7" event={"ID":"e5103329-ae63-4574-9dcc-140804f95f79","Type":"ContainerStarted","Data":"9fa8177208402f25211a87ad0da02363404d3d5a28a13edf828832275a9f1f3a"}
Apr 21 04:41:09.317412 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:09.317357 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jxwc7" event={"ID":"e5103329-ae63-4574-9dcc-140804f95f79","Type":"ContainerStarted","Data":"1a5833aabc108769d46a6374297da9e6bfd4f5efcebd3b936ddc3a3de35e2e76"}
Apr 21 04:41:09.317797 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:09.317417 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jxwc7" event={"ID":"e5103329-ae63-4574-9dcc-140804f95f79","Type":"ContainerStarted","Data":"063eb6e805359d7df3b1790d1cb62eb58057d76d0a8185a7c856b01fbf461dcc"}
Apr 21 04:41:09.333706 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:09.333656 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jxwc7" podStartSLOduration=131.335201344 podStartE2EDuration="2m12.333643979s" podCreationTimestamp="2026-04-21 04:38:57 +0000 UTC" firstStartedPulling="2026-04-21 04:41:07.937512476 +0000 UTC m=+130.564189133" lastFinishedPulling="2026-04-21 04:41:08.935955102 +0000 UTC m=+131.562631768" observedRunningTime="2026-04-21 04:41:09.332039815 +0000 UTC m=+131.958716494" watchObservedRunningTime="2026-04-21 04:41:09.333643979 +0000 UTC m=+131.960320656"
Apr 21 04:41:12.092290 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.092252 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zwvsk"]
Apr 21 04:41:12.095229 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.095211 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.099482 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.099459 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 04:41:12.100149 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.100125 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 04:41:12.100406 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.100385 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 04:41:12.100671 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.100656 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-x7jkx\""
Apr 21 04:41:12.101682 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.101663 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 04:41:12.107166 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.107142 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zwvsk"]
Apr 21 04:41:12.163873 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.163841 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56fc55f57c-qwxvv"]
Apr 21 04:41:12.164037 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:41:12.164020 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" podUID="e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"
Apr 21 04:41:12.195973 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.195942 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"]
Apr 21 04:41:12.199307 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.199289 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"
Apr 21 04:41:12.201728 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.201706 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 21 04:41:12.201841 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.201708 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pvwjh\""
Apr 21 04:41:12.201841 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.201764 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 21 04:41:12.207789 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.207763 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"]
Apr 21 04:41:12.238347 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.238317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9c8\" (UniqueName: \"kubernetes.io/projected/1202724e-b9b9-4a9b-893c-a0fd11838120-kube-api-access-vf9c8\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.238529 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.238403 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1202724e-b9b9-4a9b-893c-a0fd11838120-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.238529 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.238426 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1202724e-b9b9-4a9b-893c-a0fd11838120-crio-socket\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.238529 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.238455 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1202724e-b9b9-4a9b-893c-a0fd11838120-data-volume\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.238529 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.238497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1202724e-b9b9-4a9b-893c-a0fd11838120-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.309823 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.309783 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66898969c-9dljf"]
Apr 21 04:41:12.313184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.313165 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.326333 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.326307 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66898969c-9dljf"]
Apr 21 04:41:12.326657 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.326640 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv"
Apr 21 04:41:12.330792 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.330772 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv"
Apr 21 04:41:12.339163 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1202724e-b9b9-4a9b-893c-a0fd11838120-data-volume\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.339273 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339176 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1202724e-b9b9-4a9b-893c-a0fd11838120-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.339328 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339249 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5889b438-5ad7-4587-ad98-78b9ed6b52a5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vx96n\" (UID: \"5889b438-5ad7-4587-ad98-78b9ed6b52a5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"
Apr 21 04:41:12.339400 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339372 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9c8\" (UniqueName: \"kubernetes.io/projected/1202724e-b9b9-4a9b-893c-a0fd11838120-kube-api-access-vf9c8\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.339455 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5889b438-5ad7-4587-ad98-78b9ed6b52a5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vx96n\" (UID: \"5889b438-5ad7-4587-ad98-78b9ed6b52a5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"
Apr 21 04:41:12.339506 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1202724e-b9b9-4a9b-893c-a0fd11838120-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.339559 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1202724e-b9b9-4a9b-893c-a0fd11838120-crio-socket\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.339559 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339531 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1202724e-b9b9-4a9b-893c-a0fd11838120-data-volume\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.339651 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.339595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1202724e-b9b9-4a9b-893c-a0fd11838120-crio-socket\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.340241 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.340188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1202724e-b9b9-4a9b-893c-a0fd11838120-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.342023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.342006 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1202724e-b9b9-4a9b-893c-a0fd11838120-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.347491 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.347442 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9c8\" (UniqueName: \"kubernetes.io/projected/1202724e-b9b9-4a9b-893c-a0fd11838120-kube-api-access-vf9c8\") pod \"insights-runtime-extractor-zwvsk\" (UID: \"1202724e-b9b9-4a9b-893c-a0fd11838120\") " pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.404605 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.404569 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zwvsk"
Apr 21 04:41:12.440933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440009 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-trusted-ca\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") "
Apr 21 04:41:12.440933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440578 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-ca-trust-extracted\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") "
Apr 21 04:41:12.440933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440629 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-bound-sa-token\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") "
Apr 21 04:41:12.440933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440655 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-kube-api-access-fqt2r\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") "
Apr 21 04:41:12.440933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440730 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-image-registry-private-configuration\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") "
Apr 21 04:41:12.440933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440858 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-installation-pull-secrets\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") "
Apr 21 04:41:12.440933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440884 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-certificates\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") "
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5889b438-5ad7-4587-ad98-78b9ed6b52a5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vx96n\" (UID: \"5889b438-5ad7-4587-ad98-78b9ed6b52a5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.440500 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-trusted-ca\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441082 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgm8j\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-kube-api-access-xgm8j\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-bound-sa-token\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441141 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-image-registry-private-configuration\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-registry-certificates\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-registry-tls\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-installation-pull-secrets\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441074 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441278 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5889b438-5ad7-4587-ad98-78b9ed6b52a5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vx96n\" (UID: \"5889b438-5ad7-4587-ad98-78b9ed6b52a5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-ca-trust-extracted\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf"
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441413 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-trusted-ca\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:41:12.441489 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441434 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-ca-trust-extracted\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:41:12.442161 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.441844 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:41:12.442161 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.442057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5889b438-5ad7-4587-ad98-78b9ed6b52a5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vx96n\" (UID: \"5889b438-5ad7-4587-ad98-78b9ed6b52a5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"
Apr 21 04:41:12.443532 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.443502 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-kube-api-access-fqt2r" (OuterVolumeSpecName: "kube-api-access-fqt2r") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "kube-api-access-fqt2r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:41:12.443655 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.443562 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:41:12.443655 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.443617 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:41:12.443900 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.443879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5889b438-5ad7-4587-ad98-78b9ed6b52a5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vx96n\" (UID: \"5889b438-5ad7-4587-ad98-78b9ed6b52a5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"
Apr 21 04:41:12.444017 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.443998 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:41:12.508730 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.508695 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n" Apr 21 04:41:12.531618 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.531590 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zwvsk"] Apr 21 04:41:12.535021 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:12.534997 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1202724e_b9b9_4a9b_893c_a0fd11838120.slice/crio-a394914ce9938421937334d55ab06e522c4c7ff31dac7eb998112e5fc5279656 WatchSource:0}: Error finding container a394914ce9938421937334d55ab06e522c4c7ff31dac7eb998112e5fc5279656: Status 404 returned error can't find the container with id a394914ce9938421937334d55ab06e522c4c7ff31dac7eb998112e5fc5279656 Apr 21 04:41:12.542159 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-registry-tls\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542286 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542174 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-installation-pull-secrets\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542286 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-ca-trust-extracted\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542429 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542330 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:41:12.542429 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-trusted-ca\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542429 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgm8j\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-kube-api-access-xgm8j\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542658 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-bound-sa-token\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542658 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:41:12.542485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-image-registry-private-configuration\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542658 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-registry-certificates\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.542658 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542611 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-bound-sa-token\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:41:12.542658 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542628 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-kube-api-access-fqt2r\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:41:12.542658 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542648 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-image-registry-private-configuration\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:41:12.543220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542665 2579 reconciler_common.go:299] "Volume detached for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-installation-pull-secrets\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:41:12.543220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.542681 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-certificates\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:41:12.543887 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.543625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-registry-certificates\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.543887 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.543826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-trusted-ca\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.544060 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.544025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-ca-trust-extracted\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.545238 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.545192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-installation-pull-secrets\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.545808 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.545780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-image-registry-private-configuration\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.546590 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.546566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"image-registry-56fc55f57c-qwxvv\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:41:12.547968 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.547925 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-registry-tls\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.550883 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.550862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgm8j\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-kube-api-access-xgm8j\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.554055 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.554038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3ea294c-9c51-4fc3-a684-cce4c126b2a3-bound-sa-token\") pod \"image-registry-66898969c-9dljf\" (UID: \"b3ea294c-9c51-4fc3-a684-cce4c126b2a3\") " pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.624144 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.624115 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlhzm\"" Apr 21 04:41:12.632257 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.632228 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:12.632622 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.632600 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vx96n"] Apr 21 04:41:12.635707 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:12.635679 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5889b438_5ad7_4587_ad98_78b9ed6b52a5.slice/crio-f68328c4d04dd6d377ce00cd16f64241e6f8d2e4b148c4d0386dd80ab83bfe42 WatchSource:0}: Error finding container f68328c4d04dd6d377ce00cd16f64241e6f8d2e4b148c4d0386dd80ab83bfe42: Status 404 returned error can't find the container with id f68328c4d04dd6d377ce00cd16f64241e6f8d2e4b148c4d0386dd80ab83bfe42 Apr 21 04:41:12.744908 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.744873 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") pod \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\" (UID: \"e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e\") " Apr 21 04:41:12.746768 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.746742 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" (UID: "e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:41:12.756850 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.756823 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66898969c-9dljf"] Apr 21 04:41:12.759396 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:12.759356 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3ea294c_9c51_4fc3_a684_cce4c126b2a3.slice/crio-40e2d462a5554d2959d89e95bf9ba53393f23e3f5744a9a7a637e521f954ffcb WatchSource:0}: Error finding container 40e2d462a5554d2959d89e95bf9ba53393f23e3f5744a9a7a637e521f954ffcb: Status 404 returned error can't find the container with id 40e2d462a5554d2959d89e95bf9ba53393f23e3f5744a9a7a637e521f954ffcb Apr 21 04:41:12.845632 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:12.845602 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e-registry-tls\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:41:13.331983 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.331944 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n" event={"ID":"5889b438-5ad7-4587-ad98-78b9ed6b52a5","Type":"ContainerStarted","Data":"f68328c4d04dd6d377ce00cd16f64241e6f8d2e4b148c4d0386dd80ab83bfe42"} Apr 21 04:41:13.333702 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.333673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-66898969c-9dljf" event={"ID":"b3ea294c-9c51-4fc3-a684-cce4c126b2a3","Type":"ContainerStarted","Data":"8743445c77ea8dc771a73128991bf46f742afdabb7c72e19ac3b1487dca90561"} Apr 21 04:41:13.333849 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.333709 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66898969c-9dljf" event={"ID":"b3ea294c-9c51-4fc3-a684-cce4c126b2a3","Type":"ContainerStarted","Data":"40e2d462a5554d2959d89e95bf9ba53393f23e3f5744a9a7a637e521f954ffcb"} Apr 21 04:41:13.333849 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.333804 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:13.335630 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.335605 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwvsk" event={"ID":"1202724e-b9b9-4a9b-893c-a0fd11838120","Type":"ContainerStarted","Data":"0877b81f85eee7f41e4761cd9d8df038a09cbe0b35e66579120d4bb6d5e18267"} Apr 21 04:41:13.335630 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.335623 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56fc55f57c-qwxvv" Apr 21 04:41:13.335773 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.335637 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwvsk" event={"ID":"1202724e-b9b9-4a9b-893c-a0fd11838120","Type":"ContainerStarted","Data":"f13bf66ca3f58bf8759d497a2a35df88ca16847d35be075e52bdbc8a90c77118"} Apr 21 04:41:13.335773 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.335651 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwvsk" event={"ID":"1202724e-b9b9-4a9b-893c-a0fd11838120","Type":"ContainerStarted","Data":"a394914ce9938421937334d55ab06e522c4c7ff31dac7eb998112e5fc5279656"} Apr 21 04:41:13.352771 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.352716 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66898969c-9dljf" podStartSLOduration=1.3527001109999999 podStartE2EDuration="1.352700111s" podCreationTimestamp="2026-04-21 04:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:41:13.351245251 +0000 UTC m=+135.977921931" watchObservedRunningTime="2026-04-21 04:41:13.352700111 +0000 UTC m=+135.979376793" Apr 21 04:41:13.392338 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.392307 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56fc55f57c-qwxvv"] Apr 21 04:41:13.395908 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.395880 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56fc55f57c-qwxvv"] Apr 21 04:41:13.896475 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:13.896441 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e" 
path="/var/lib/kubelet/pods/e81e5a1f-7bf2-4ecf-aa24-4259932a2a0e/volumes" Apr 21 04:41:14.342439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:14.342398 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n" event={"ID":"5889b438-5ad7-4587-ad98-78b9ed6b52a5","Type":"ContainerStarted","Data":"c69f4086d126c968e935436be62f4017d30eb80dfc7f203e1f6bc0ed885c99ab"} Apr 21 04:41:14.361644 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:14.359447 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vx96n" podStartSLOduration=1.207699716 podStartE2EDuration="2.359423808s" podCreationTimestamp="2026-04-21 04:41:12 +0000 UTC" firstStartedPulling="2026-04-21 04:41:12.637558774 +0000 UTC m=+135.264235430" lastFinishedPulling="2026-04-21 04:41:13.789282863 +0000 UTC m=+136.415959522" observedRunningTime="2026-04-21 04:41:14.3588419 +0000 UTC m=+136.985518573" watchObservedRunningTime="2026-04-21 04:41:14.359423808 +0000 UTC m=+136.986100486" Apr 21 04:41:15.347031 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:15.346995 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zwvsk" event={"ID":"1202724e-b9b9-4a9b-893c-a0fd11838120","Type":"ContainerStarted","Data":"a811c27162afc3424c5283d0fdb947194f162b4516ca8d15bab4b88130ad0e7e"} Apr 21 04:41:15.364721 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:15.364667 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zwvsk" podStartSLOduration=1.392333724 podStartE2EDuration="3.364651888s" podCreationTimestamp="2026-04-21 04:41:12 +0000 UTC" firstStartedPulling="2026-04-21 04:41:12.603629372 +0000 UTC m=+135.230306035" lastFinishedPulling="2026-04-21 04:41:14.575947543 +0000 UTC m=+137.202624199" observedRunningTime="2026-04-21 04:41:15.363589415 +0000 
UTC m=+137.990266094" watchObservedRunningTime="2026-04-21 04:41:15.364651888 +0000 UTC m=+137.991328557" Apr 21 04:41:16.321956 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.321918 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kk7nd"] Apr 21 04:41:16.324965 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.324946 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.327495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.327473 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 04:41:16.328536 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.328510 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-4xjwp\"" Apr 21 04:41:16.328659 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.328572 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 04:41:16.328659 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.328584 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 04:41:16.328659 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.328571 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 04:41:16.328659 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.328601 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 04:41:16.335603 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.335580 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-5676c8c784-kk7nd"] Apr 21 04:41:16.472942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.472905 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.473323 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.472951 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.473323 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.473004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.473323 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.473079 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vxpp\" (UniqueName: \"kubernetes.io/projected/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-kube-api-access-4vxpp\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.574079 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.573996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vxpp\" (UniqueName: \"kubernetes.io/projected/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-kube-api-access-4vxpp\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.574079 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.574068 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.574283 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.574094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.574283 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.574115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.574811 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.574790 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.576557 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.576531 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.576638 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.576600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.583466 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.583437 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vxpp\" (UniqueName: \"kubernetes.io/projected/ec960f2e-772b-471f-96bb-a0d4b9ff4f15-kube-api-access-4vxpp\") pod \"prometheus-operator-5676c8c784-kk7nd\" (UID: \"ec960f2e-772b-471f-96bb-a0d4b9ff4f15\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.634303 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.634266 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" Apr 21 04:41:16.750673 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:16.750639 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kk7nd"] Apr 21 04:41:16.753782 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:16.753748 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec960f2e_772b_471f_96bb_a0d4b9ff4f15.slice/crio-d3b74fc9939843bcc27fb4c299144ef53e89b140f2bef8296ed34d47fc71d417 WatchSource:0}: Error finding container d3b74fc9939843bcc27fb4c299144ef53e89b140f2bef8296ed34d47fc71d417: Status 404 returned error can't find the container with id d3b74fc9939843bcc27fb4c299144ef53e89b140f2bef8296ed34d47fc71d417 Apr 21 04:41:17.353999 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:17.353953 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" event={"ID":"ec960f2e-772b-471f-96bb-a0d4b9ff4f15","Type":"ContainerStarted","Data":"d3b74fc9939843bcc27fb4c299144ef53e89b140f2bef8296ed34d47fc71d417"} Apr 21 04:41:18.358445 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:18.358409 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" event={"ID":"ec960f2e-772b-471f-96bb-a0d4b9ff4f15","Type":"ContainerStarted","Data":"a42078816ca9dbdde5eb6c29b70bbe96adba53a53b95b13ba78e163b830edaef"} Apr 21 04:41:18.358814 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:18.358447 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" event={"ID":"ec960f2e-772b-471f-96bb-a0d4b9ff4f15","Type":"ContainerStarted","Data":"bc49927bdedbc01f6686c2951113051552ed08d23934fe6b264585b3d7d71d0f"} Apr 21 04:41:18.374993 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:18.374929 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-kk7nd" podStartSLOduration=1.082633195 podStartE2EDuration="2.374913647s" podCreationTimestamp="2026-04-21 04:41:16 +0000 UTC" firstStartedPulling="2026-04-21 04:41:16.755556637 +0000 UTC m=+139.382233292" lastFinishedPulling="2026-04-21 04:41:18.047837078 +0000 UTC m=+140.674513744" observedRunningTime="2026-04-21 04:41:18.373581983 +0000 UTC m=+141.000258661" watchObservedRunningTime="2026-04-21 04:41:18.374913647 +0000 UTC m=+141.001590325" Apr 21 04:41:20.679920 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.679886 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-trq4k"] Apr 21 04:41:20.688471 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.688440 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d77nd"] Apr 21 04:41:20.688627 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.688595 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.690938 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.690914 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 04:41:20.691159 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.691140 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p4hrl\"" Apr 21 04:41:20.691233 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.691040 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 04:41:20.691233 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.690940 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 04:41:20.691513 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.691479 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.693682 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.693663 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wtt28\"" Apr 21 04:41:20.693820 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.693800 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 04:41:20.693929 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.693672 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 04:41:20.694269 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.694250 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 04:41:20.700351 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.700332 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d77nd"] Apr 21 04:41:20.705019 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.704999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlbgp\" (UniqueName: \"kubernetes.io/projected/730b7810-de68-4798-9079-e3cdd2121300-kube-api-access-wlbgp\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705234 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.705359 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzs8\" (UniqueName: \"kubernetes.io/projected/380f7265-1a51-467a-9169-1011757d613d-kube-api-access-rxzs8\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.705475 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/730b7810-de68-4798-9079-e3cdd2121300-metrics-client-ca\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705554 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705501 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-tls\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705554 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705543 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-accelerators-collector-config\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705654 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:41:20.705579 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.705711 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705674 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-wtmp\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705767 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/380f7265-1a51-467a-9169-1011757d613d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.705767 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705737 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.705857 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705857 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-root\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705949 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-textfile\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.705949 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705887 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/380f7265-1a51-467a-9169-1011757d613d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.705949 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.705910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-sys\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " 
pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.806585 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-tls\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.806585 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-accelerators-collector-config\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.806808 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.806808 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-wtmp\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.806808 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:41:20.806793 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 04:41:20.806965 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-wtmp\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.806965 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:41:20.806871 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-tls podName:380f7265-1a51-467a-9169-1011757d613d nodeName:}" failed. No retries permitted until 2026-04-21 04:41:21.306849796 +0000 UTC m=+143.933526459 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-d77nd" (UID: "380f7265-1a51-467a-9169-1011757d613d") : secret "kube-state-metrics-tls" not found Apr 21 04:41:20.806965 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806897 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/380f7265-1a51-467a-9169-1011757d613d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.806965 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.807139 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.806989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807139 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-root\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807139 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-textfile\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807139 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/380f7265-1a51-467a-9169-1011757d613d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.807139 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-sys\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807394 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlbgp\" (UniqueName: \"kubernetes.io/projected/730b7810-de68-4798-9079-e3cdd2121300-kube-api-access-wlbgp\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807394 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.807394 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzs8\" (UniqueName: \"kubernetes.io/projected/380f7265-1a51-467a-9169-1011757d613d-kube-api-access-rxzs8\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.807394 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/730b7810-de68-4798-9079-e3cdd2121300-metrics-client-ca\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807394 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:41:20.807331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-accelerators-collector-config\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807656 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-textfile\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.807816 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807794 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/730b7810-de68-4798-9079-e3cdd2121300-metrics-client-ca\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.808005 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.807983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/380f7265-1a51-467a-9169-1011757d613d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.808087 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.808071 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/380f7265-1a51-467a-9169-1011757d613d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: 
\"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.808145 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.808111 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.808341 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.808319 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-sys\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.808540 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.808517 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/730b7810-de68-4798-9079-e3cdd2121300-root\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.810152 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.810075 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.811267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.810395 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:20.811267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.810503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/730b7810-de68-4798-9079-e3cdd2121300-node-exporter-tls\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.815697 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.815669 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlbgp\" (UniqueName: \"kubernetes.io/projected/730b7810-de68-4798-9079-e3cdd2121300-kube-api-access-wlbgp\") pod \"node-exporter-trq4k\" (UID: \"730b7810-de68-4798-9079-e3cdd2121300\") " pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:20.816235 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:20.816208 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzs8\" (UniqueName: \"kubernetes.io/projected/380f7265-1a51-467a-9169-1011757d613d-kube-api-access-rxzs8\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:21.000716 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:21.000622 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-trq4k" Apr 21 04:41:21.011567 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:21.011537 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod730b7810_de68_4798_9079_e3cdd2121300.slice/crio-98ce5160b0082537dac4c34b9ddd4f697deae9f729a1cf760c0d9a7ada2a8eff WatchSource:0}: Error finding container 98ce5160b0082537dac4c34b9ddd4f697deae9f729a1cf760c0d9a7ada2a8eff: Status 404 returned error can't find the container with id 98ce5160b0082537dac4c34b9ddd4f697deae9f729a1cf760c0d9a7ada2a8eff Apr 21 04:41:21.315206 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:21.315174 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:21.317522 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:21.317500 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/380f7265-1a51-467a-9169-1011757d613d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d77nd\" (UID: \"380f7265-1a51-467a-9169-1011757d613d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:21.367296 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:21.367257 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trq4k" event={"ID":"730b7810-de68-4798-9079-e3cdd2121300","Type":"ContainerStarted","Data":"98ce5160b0082537dac4c34b9ddd4f697deae9f729a1cf760c0d9a7ada2a8eff"} Apr 21 04:41:21.608511 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:21.608427 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" Apr 21 04:41:21.749701 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:21.749672 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d77nd"] Apr 21 04:41:21.803336 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:21.803291 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380f7265_1a51_467a_9169_1011757d613d.slice/crio-5f3aded5843ee92d686f2fed6e88e35aaaaaa94456400ffdf99901494a1047b8 WatchSource:0}: Error finding container 5f3aded5843ee92d686f2fed6e88e35aaaaaa94456400ffdf99901494a1047b8: Status 404 returned error can't find the container with id 5f3aded5843ee92d686f2fed6e88e35aaaaaa94456400ffdf99901494a1047b8 Apr 21 04:41:22.371645 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:22.371604 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" event={"ID":"380f7265-1a51-467a-9169-1011757d613d","Type":"ContainerStarted","Data":"5f3aded5843ee92d686f2fed6e88e35aaaaaa94456400ffdf99901494a1047b8"} Apr 21 04:41:22.373117 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:22.373089 2579 generic.go:358] "Generic (PLEG): container finished" podID="730b7810-de68-4798-9079-e3cdd2121300" containerID="160de6b2adc577b7a3f8efa99c97e5b1a43e1951c44252253c715dc38d28db9d" exitCode=0 Apr 21 04:41:22.373258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:22.373123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trq4k" event={"ID":"730b7810-de68-4798-9079-e3cdd2121300","Type":"ContainerDied","Data":"160de6b2adc577b7a3f8efa99c97e5b1a43e1951c44252253c715dc38d28db9d"} Apr 21 04:41:23.377668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:23.377630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trq4k" 
event={"ID":"730b7810-de68-4798-9079-e3cdd2121300","Type":"ContainerStarted","Data":"150e3c98e65c8ad0a98fe1f6b869da8d1a6dd0e596aac8c1d18459ceb44f1c5d"} Apr 21 04:41:23.378131 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:23.377676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-trq4k" event={"ID":"730b7810-de68-4798-9079-e3cdd2121300","Type":"ContainerStarted","Data":"9f1ccb6fcf34aea41dcd3e640371a68355ce2f0bf99bc1ed9858a8574da08b4a"} Apr 21 04:41:23.379571 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:23.379535 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" event={"ID":"380f7265-1a51-467a-9169-1011757d613d","Type":"ContainerStarted","Data":"a4ca76eb9667b4130ede7bd0049ef05a21b24d0e6aceca2cf2db792cf894e3f2"} Apr 21 04:41:23.379571 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:23.379569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" event={"ID":"380f7265-1a51-467a-9169-1011757d613d","Type":"ContainerStarted","Data":"a26103e42b884e182e6c65c2c0635e4b81a1d4871f4758f90896a658dbb4c94b"} Apr 21 04:41:23.379734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:23.379583 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" event={"ID":"380f7265-1a51-467a-9169-1011757d613d","Type":"ContainerStarted","Data":"19e42ed8686b6c09d42080178c581d6ae59d130c77471edd4059c99e1632c322"} Apr 21 04:41:23.397854 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:23.397801 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-trq4k" podStartSLOduration=2.565019424 podStartE2EDuration="3.39778557s" podCreationTimestamp="2026-04-21 04:41:20 +0000 UTC" firstStartedPulling="2026-04-21 04:41:21.01717271 +0000 UTC m=+143.643849373" lastFinishedPulling="2026-04-21 04:41:21.849938859 +0000 
UTC m=+144.476615519" observedRunningTime="2026-04-21 04:41:23.395918922 +0000 UTC m=+146.022595599" watchObservedRunningTime="2026-04-21 04:41:23.39778557 +0000 UTC m=+146.024462248" Apr 21 04:41:23.413179 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:23.413132 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-d77nd" podStartSLOduration=2.387319885 podStartE2EDuration="3.413118536s" podCreationTimestamp="2026-04-21 04:41:20 +0000 UTC" firstStartedPulling="2026-04-21 04:41:21.805237304 +0000 UTC m=+144.431913961" lastFinishedPulling="2026-04-21 04:41:22.831035944 +0000 UTC m=+145.457712612" observedRunningTime="2026-04-21 04:41:23.412396885 +0000 UTC m=+146.039073563" watchObservedRunningTime="2026-04-21 04:41:23.413118536 +0000 UTC m=+146.039795214" Apr 21 04:41:24.371092 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.371058 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7wgpt"] Apr 21 04:41:24.374286 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.374258 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7wgpt"
Apr 21 04:41:24.376807 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.376784 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ntm8l\""
Apr 21 04:41:24.376968 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.376862 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 04:41:24.377151 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.377131 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 04:41:24.383458 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.383425 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7wgpt"]
Apr 21 04:41:24.445454 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.445420 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtdtp\" (UniqueName: \"kubernetes.io/projected/5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5-kube-api-access-qtdtp\") pod \"downloads-6bcc868b7-7wgpt\" (UID: \"5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5\") " pod="openshift-console/downloads-6bcc868b7-7wgpt"
Apr 21 04:41:24.546927 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.546871 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtdtp\" (UniqueName: \"kubernetes.io/projected/5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5-kube-api-access-qtdtp\") pod \"downloads-6bcc868b7-7wgpt\" (UID: \"5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5\") " pod="openshift-console/downloads-6bcc868b7-7wgpt"
Apr 21 04:41:24.555392 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.555332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtdtp\" (UniqueName: \"kubernetes.io/projected/5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5-kube-api-access-qtdtp\") pod \"downloads-6bcc868b7-7wgpt\" (UID: \"5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5\") " pod="openshift-console/downloads-6bcc868b7-7wgpt"
Apr 21 04:41:24.685180 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.685095 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7wgpt"
Apr 21 04:41:24.801122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:24.800846 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7wgpt"]
Apr 21 04:41:24.805108 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:24.805074 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce78fb5_83b3_4d9c_8aeb_db5f4df5abb5.slice/crio-ded97d93cf945f8af5916c25bc5924fb24366207cf4e42bf6ef7643d65112841 WatchSource:0}: Error finding container ded97d93cf945f8af5916c25bc5924fb24366207cf4e42bf6ef7643d65112841: Status 404 returned error can't find the container with id ded97d93cf945f8af5916c25bc5924fb24366207cf4e42bf6ef7643d65112841
Apr 21 04:41:25.061765 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.061723 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-599ff764cd-p45c5"]
Apr 21 04:41:25.066345 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.066327 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.068788 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.068764 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 21 04:41:25.068898 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.068796 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 21 04:41:25.070112 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.069974 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 21 04:41:25.070594 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.070444 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 21 04:41:25.070594 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.070477 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fqa6fg62b395f\""
Apr 21 04:41:25.070594 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.070499 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-db5vx\""
Apr 21 04:41:25.071889 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.071866 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-599ff764cd-p45c5"]
Apr 21 04:41:25.151612 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.151571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-secret-metrics-server-client-certs\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.151781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.151619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/115c3833-f05a-48f9-a403-7f41408d8114-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.151781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.151690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/115c3833-f05a-48f9-a403-7f41408d8114-audit-log\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.151781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.151759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-client-ca-bundle\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.151904 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.151824 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-secret-metrics-server-tls\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.151904 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.151864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f54kp\" (UniqueName: \"kubernetes.io/projected/115c3833-f05a-48f9-a403-7f41408d8114-kube-api-access-f54kp\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.151995 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.151948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/115c3833-f05a-48f9-a403-7f41408d8114-metrics-server-audit-profiles\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.252910 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.252868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/115c3833-f05a-48f9-a403-7f41408d8114-metrics-server-audit-profiles\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.252950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-secret-metrics-server-client-certs\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.252984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/115c3833-f05a-48f9-a403-7f41408d8114-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253086 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.253011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/115c3833-f05a-48f9-a403-7f41408d8114-audit-log\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.253109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-client-ca-bundle\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.253187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-secret-metrics-server-tls\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.253219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f54kp\" (UniqueName: \"kubernetes.io/projected/115c3833-f05a-48f9-a403-7f41408d8114-kube-api-access-f54kp\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253726 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.253547 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/115c3833-f05a-48f9-a403-7f41408d8114-audit-log\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.253883 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.253802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/115c3833-f05a-48f9-a403-7f41408d8114-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.254060 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.254020 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/115c3833-f05a-48f9-a403-7f41408d8114-metrics-server-audit-profiles\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.256756 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.256732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-client-ca-bundle\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.256887 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.256725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-secret-metrics-server-tls\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.256887 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.256850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/115c3833-f05a-48f9-a403-7f41408d8114-secret-metrics-server-client-certs\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.266705 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.266673 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f54kp\" (UniqueName: \"kubernetes.io/projected/115c3833-f05a-48f9-a403-7f41408d8114-kube-api-access-f54kp\") pod \"metrics-server-599ff764cd-p45c5\" (UID: \"115c3833-f05a-48f9-a403-7f41408d8114\") " pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.377284 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.377176 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:25.387610 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.387573 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7wgpt" event={"ID":"5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5","Type":"ContainerStarted","Data":"ded97d93cf945f8af5916c25bc5924fb24366207cf4e42bf6ef7643d65112841"}
Apr 21 04:41:25.438105 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.438070 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"]
Apr 21 04:41:25.443871 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.443845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:25.450173 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.449551 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-mwchc\""
Apr 21 04:41:25.450173 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.449801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 21 04:41:25.451831 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.451782 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"]
Apr 21 04:41:25.540406 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.540376 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-599ff764cd-p45c5"]
Apr 21 04:41:25.543184 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:25.543153 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod115c3833_f05a_48f9_a403_7f41408d8114.slice/crio-fb8e0c57ab9188202119a084f88b799e6fc0327746e79eb2ec551fc4bad71855 WatchSource:0}: Error finding container fb8e0c57ab9188202119a084f88b799e6fc0327746e79eb2ec551fc4bad71855: Status 404 returned error can't find the container with id fb8e0c57ab9188202119a084f88b799e6fc0327746e79eb2ec551fc4bad71855
Apr 21 04:41:25.558230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.558202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-474k4\" (UID: \"a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:25.659679 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.659456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-474k4\" (UID: \"a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:25.659679 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:41:25.659638 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 21 04:41:25.659896 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:41:25.659730 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23-monitoring-plugin-cert podName:a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:26.159708747 +0000 UTC m=+148.786385403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-474k4" (UID: "a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23") : secret "monitoring-plugin-cert" not found
Apr 21 04:41:25.881025 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.880991 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"]
Apr 21 04:41:25.884818 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.884793 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.887484 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.887452 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 21 04:41:25.887633 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.887526 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 21 04:41:25.887633 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.887547 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 21 04:41:25.887633 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.887615 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 21 04:41:25.887633 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.887551 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 21 04:41:25.887897 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.887807 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-tqvbz\""
Apr 21 04:41:25.893260 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.893077 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 21 04:41:25.899417 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.899393 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"]
Apr 21 04:41:25.961584 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961462 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-secret-telemeter-client\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.961765 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961582 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbfr7\" (UniqueName: \"kubernetes.io/projected/0fd5110b-1359-44b2-ba72-c15680c47476-kube-api-access-fbfr7\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.961765 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-metrics-client-ca\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.961765 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-serving-certs-ca-bundle\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.961920 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-telemeter-client-tls\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.961920 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961825 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.962019 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-federate-client-tls\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:25.962019 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:25.961990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063427 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-secret-telemeter-client\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063427 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbfr7\" (UniqueName: \"kubernetes.io/projected/0fd5110b-1359-44b2-ba72-c15680c47476-kube-api-access-fbfr7\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063427 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063339 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-metrics-client-ca\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063427 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063397 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-serving-certs-ca-bundle\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063623 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063438 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-telemeter-client-tls\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063623 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.063623 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.063531 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-federate-client-tls\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.065267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.064924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.066038 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.065991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-serving-certs-ca-bundle\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.066446 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.066408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fd5110b-1359-44b2-ba72-c15680c47476-metrics-client-ca\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.067344 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.067300 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-secret-telemeter-client\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.068128 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.068095 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-telemeter-client-tls\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.068327 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.068172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-federate-client-tls\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.069770 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.069727 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fd5110b-1359-44b2-ba72-c15680c47476-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.078550 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.078504 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbfr7\" (UniqueName: \"kubernetes.io/projected/0fd5110b-1359-44b2-ba72-c15680c47476-kube-api-access-fbfr7\") pod \"telemeter-client-6f67867bb5-f4rv6\" (UID: \"0fd5110b-1359-44b2-ba72-c15680c47476\") " pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.164980 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.164937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-474k4\" (UID: \"a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:26.168302 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.168241 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-474k4\" (UID: \"a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:26.199198 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.199159 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"
Apr 21 04:41:26.349770 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.349736 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f67867bb5-f4rv6"]
Apr 21 04:41:26.353292 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:26.353251 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd5110b_1359_44b2_ba72_c15680c47476.slice/crio-789ba72dbf163362f59995be9350da6b9747dd0a325e35f821fa7cbbad8e2be2 WatchSource:0}: Error finding container 789ba72dbf163362f59995be9350da6b9747dd0a325e35f821fa7cbbad8e2be2: Status 404 returned error can't find the container with id 789ba72dbf163362f59995be9350da6b9747dd0a325e35f821fa7cbbad8e2be2
Apr 21 04:41:26.370048 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.370018 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:26.392694 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.392628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5" event={"ID":"115c3833-f05a-48f9-a403-7f41408d8114","Type":"ContainerStarted","Data":"fb8e0c57ab9188202119a084f88b799e6fc0327746e79eb2ec551fc4bad71855"}
Apr 21 04:41:26.394180 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.394148 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6" event={"ID":"0fd5110b-1359-44b2-ba72-c15680c47476","Type":"ContainerStarted","Data":"789ba72dbf163362f59995be9350da6b9747dd0a325e35f821fa7cbbad8e2be2"}
Apr 21 04:41:26.511912 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:26.511816 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"]
Apr 21 04:41:27.008881 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:27.008827 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1eaa9fb_25fe_4e1f_a838_4fcbae8a5c23.slice/crio-ef5d91bd7dfa29cbaaeeb090ca81decd9733853d556a3249387c6bcfbd3d515a WatchSource:0}: Error finding container ef5d91bd7dfa29cbaaeeb090ca81decd9733853d556a3249387c6bcfbd3d515a: Status 404 returned error can't find the container with id ef5d91bd7dfa29cbaaeeb090ca81decd9733853d556a3249387c6bcfbd3d515a
Apr 21 04:41:27.398915 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:27.398860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4" event={"ID":"a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23","Type":"ContainerStarted","Data":"ef5d91bd7dfa29cbaaeeb090ca81decd9733853d556a3249387c6bcfbd3d515a"}
Apr 21 04:41:27.405690 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:27.405641 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5" event={"ID":"115c3833-f05a-48f9-a403-7f41408d8114","Type":"ContainerStarted","Data":"6a542b5e4e0c06255dab211cbad2fbec3c3dabc431ec6cf2ada636cfb9b3267d"}
Apr 21 04:41:27.427721 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:27.427059 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5" podStartSLOduration=0.891909382 podStartE2EDuration="2.42704015s" podCreationTimestamp="2026-04-21 04:41:25 +0000 UTC" firstStartedPulling="2026-04-21 04:41:25.545091064 +0000 UTC m=+148.171767720" lastFinishedPulling="2026-04-21 04:41:27.08022183 +0000 UTC m=+149.706898488" observedRunningTime="2026-04-21 04:41:27.425875245 +0000 UTC m=+150.052551924" watchObservedRunningTime="2026-04-21 04:41:27.42704015 +0000 UTC m=+150.053716828"
Apr 21 04:41:29.415036 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:29.414992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6" event={"ID":"0fd5110b-1359-44b2-ba72-c15680c47476","Type":"ContainerStarted","Data":"32411d44b8a7c025071cfbc7bcc2cd1200ea4c4bf4b340b45ddfd42b01e02d59"}
Apr 21 04:41:29.416941 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:29.416888 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4" event={"ID":"a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23","Type":"ContainerStarted","Data":"54311f806a596e0ae5ae8b25d22a87ed3269513be5703dfc3ba8662f431393c0"}
Apr 21 04:41:29.417719 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:29.417212 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:29.423923 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:29.423891 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4"
Apr 21 04:41:29.432498 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:29.432443 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-474k4" podStartSLOduration=2.5185122890000002 podStartE2EDuration="4.432425989s" podCreationTimestamp="2026-04-21 04:41:25 +0000 UTC" firstStartedPulling="2026-04-21 04:41:27.011093602 +0000 UTC m=+149.637770262" lastFinishedPulling="2026-04-21 04:41:28.925007291 +0000 UTC m=+151.551683962" observedRunningTime="2026-04-21 04:41:29.432069167 +0000 UTC m=+152.058745844" watchObservedRunningTime="2026-04-21 04:41:29.432425989 +0000 UTC m=+152.059102668"
Apr 21 04:41:30.422461 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:30.422426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6" event={"ID":"0fd5110b-1359-44b2-ba72-c15680c47476","Type":"ContainerStarted","Data":"1cb2ae13f843cea76c6c4a5bb003bcd3ebeb2aa2883d4660e8766fd7819ba978"}
Apr 21 04:41:30.422933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:30.422468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6" event={"ID":"0fd5110b-1359-44b2-ba72-c15680c47476","Type":"ContainerStarted","Data":"26b3d8863b144e6d66559b3e337dbcf3279ea20c2892ffd6610a2fb175e09885"}
Apr 21 04:41:30.445033 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:30.444959 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6f67867bb5-f4rv6" podStartSLOduration=1.780720463 podStartE2EDuration="5.444938997s" podCreationTimestamp="2026-04-21 04:41:25 +0000 UTC" firstStartedPulling="2026-04-21 04:41:26.355677492 +0000 UTC m=+148.982354162" lastFinishedPulling="2026-04-21 04:41:30.01989604 +0000 UTC m=+152.646572696" observedRunningTime="2026-04-21 04:41:30.441993207 +0000 UTC m=+153.068669885" watchObservedRunningTime="2026-04-21 04:41:30.444938997 +0000 UTC m=+153.071615676"
Apr 21 04:41:32.636859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:32.636822 2579 patch_prober.go:28] interesting pod/image-registry-66898969c-9dljf container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 21 04:41:32.637274 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:32.636878 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66898969c-9dljf" podUID="b3ea294c-9c51-4fc3-a684-cce4c126b2a3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:41:33.839462 ip-10-0-135-122 kubenswrapper[2579]: E0421
04:41:33.839412 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vk48w" podUID="56593aaa-f779-4c98-94da-5b75ed6e9124" Apr 21 04:41:33.851635 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:41:33.851595 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-j8jtk" podUID="f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2" Apr 21 04:41:34.348325 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:34.348292 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66898969c-9dljf" Apr 21 04:41:34.434314 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:34.434276 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vk48w" Apr 21 04:41:35.145287 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.145248 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59f74db4bf-g578v"] Apr 21 04:41:35.149871 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.149842 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.152535 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.152494 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 04:41:35.152651 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.152603 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gbfzr\"" Apr 21 04:41:35.153642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.153621 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 04:41:35.153754 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.153691 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 04:41:35.153754 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.153702 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 04:41:35.153866 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.153750 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 04:41:35.158163 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.157788 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59f74db4bf-g578v"] Apr 21 04:41:35.259836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.259787 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-console-config\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.260012 ip-10-0-135-122 kubenswrapper[2579]: I0421 
04:41:35.259900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-oauth-config\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.260012 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.259936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-serving-cert\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.260012 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.259971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-service-ca\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.260012 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.259997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tf6c\" (UniqueName: \"kubernetes.io/projected/01810937-538e-48fa-933b-413bee53f6d8-kube-api-access-6tf6c\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.260209 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.260109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-oauth-serving-cert\") pod 
\"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.361408 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.361342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-service-ca\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.361408 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.361413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tf6c\" (UniqueName: \"kubernetes.io/projected/01810937-538e-48fa-933b-413bee53f6d8-kube-api-access-6tf6c\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.361642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.361440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-oauth-serving-cert\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.361642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.361515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-console-config\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.361642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.361583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-oauth-config\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.361642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.361607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-serving-cert\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.362184 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.362158 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-service-ca\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.362310 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.362196 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-console-config\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.362310 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.362279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-oauth-serving-cert\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.364320 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.364296 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-serving-cert\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.364428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.364304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-oauth-config\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.369804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.369777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tf6c\" (UniqueName: \"kubernetes.io/projected/01810937-538e-48fa-933b-413bee53f6d8-kube-api-access-6tf6c\") pod \"console-59f74db4bf-g578v\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") " pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:35.461617 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:35.461522 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59f74db4bf-g578v" Apr 21 04:41:38.694186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:38.694146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w" Apr 21 04:41:38.696869 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:38.696839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/56593aaa-f779-4c98-94da-5b75ed6e9124-metrics-tls\") pod \"dns-default-vk48w\" (UID: \"56593aaa-f779-4c98-94da-5b75ed6e9124\") " pod="openshift-dns/dns-default-vk48w" Apr 21 04:41:38.795345 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:38.795304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk" Apr 21 04:41:38.798222 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:38.798195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2-cert\") pod \"ingress-canary-j8jtk\" (UID: \"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2\") " pod="openshift-ingress-canary/ingress-canary-j8jtk" Apr 21 04:41:38.937644 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:38.937611 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f9n52\"" Apr 21 04:41:38.945759 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:38.945680 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vk48w" Apr 21 04:41:41.269529 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.269498 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59f74db4bf-g578v"] Apr 21 04:41:41.274199 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:41.274162 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01810937_538e_48fa_933b_413bee53f6d8.slice/crio-1df1f0bfe76cff07ca8ffe3446698ea411edecab3590f0fb3b84a4267531abf3 WatchSource:0}: Error finding container 1df1f0bfe76cff07ca8ffe3446698ea411edecab3590f0fb3b84a4267531abf3: Status 404 returned error can't find the container with id 1df1f0bfe76cff07ca8ffe3446698ea411edecab3590f0fb3b84a4267531abf3 Apr 21 04:41:41.282048 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.281977 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vk48w"] Apr 21 04:41:41.298526 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:41.298491 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56593aaa_f779_4c98_94da_5b75ed6e9124.slice/crio-e9c1271e2fc85dff62bf1efe92937b2c44bc4af7be73ad11e935388746599117 WatchSource:0}: Error finding container e9c1271e2fc85dff62bf1efe92937b2c44bc4af7be73ad11e935388746599117: Status 404 returned error can't find the container with id e9c1271e2fc85dff62bf1efe92937b2c44bc4af7be73ad11e935388746599117 Apr 21 04:41:41.456350 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.456251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f74db4bf-g578v" event={"ID":"01810937-538e-48fa-933b-413bee53f6d8","Type":"ContainerStarted","Data":"1df1f0bfe76cff07ca8ffe3446698ea411edecab3590f0fb3b84a4267531abf3"} Apr 21 04:41:41.457546 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.457508 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-vk48w" event={"ID":"56593aaa-f779-4c98-94da-5b75ed6e9124","Type":"ContainerStarted","Data":"e9c1271e2fc85dff62bf1efe92937b2c44bc4af7be73ad11e935388746599117"} Apr 21 04:41:41.459002 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.458975 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7wgpt" event={"ID":"5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5","Type":"ContainerStarted","Data":"37c8a346ae07e50ceca45f88146c312c7fc5d548fccc44888ce2cbb5e35115aa"} Apr 21 04:41:41.459201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.459176 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7wgpt" Apr 21 04:41:41.460769 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.460743 2579 patch_prober.go:28] interesting pod/downloads-6bcc868b7-7wgpt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.18:8080/\": dial tcp 10.133.0.18:8080: connect: connection refused" start-of-body= Apr 21 04:41:41.460873 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.460799 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-7wgpt" podUID="5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.18:8080/\": dial tcp 10.133.0.18:8080: connect: connection refused" Apr 21 04:41:41.480258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:41.480193 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7wgpt" podStartSLOduration=1.052415684 podStartE2EDuration="17.480178142s" podCreationTimestamp="2026-04-21 04:41:24 +0000 UTC" firstStartedPulling="2026-04-21 04:41:24.807070238 +0000 UTC m=+147.433746894" lastFinishedPulling="2026-04-21 04:41:41.234832692 +0000 UTC m=+163.861509352" 
observedRunningTime="2026-04-21 04:41:41.478439665 +0000 UTC m=+164.105116345" watchObservedRunningTime="2026-04-21 04:41:41.480178142 +0000 UTC m=+164.106854821" Apr 21 04:41:42.470395 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:42.470150 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7wgpt" Apr 21 04:41:43.881502 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:43.881428 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c9597f7b8-nwszm"] Apr 21 04:41:43.903893 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:43.903852 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:43.910612 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:43.910581 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c9597f7b8-nwszm"] Apr 21 04:41:43.912081 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:43.912053 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 04:41:44.043964 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.043921 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-config\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.044155 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.043984 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-service-ca\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " 
pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.044155 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.044053 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-serving-cert\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.044155 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.044096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-oauth-config\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.044340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.044203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-trusted-ca-bundle\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.044340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.044232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-oauth-serving-cert\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.044340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.044316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb9pf\" 
(UniqueName: \"kubernetes.io/projected/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-kube-api-access-sb9pf\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.145804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.145410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9pf\" (UniqueName: \"kubernetes.io/projected/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-kube-api-access-sb9pf\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.145804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.145474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-config\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.145804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.145518 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-service-ca\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.145804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.145555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-serving-cert\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.145804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.145586 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-oauth-config\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.145804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.145643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-trusted-ca-bundle\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.145804 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.145667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-oauth-serving-cert\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.147587 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.146991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-oauth-serving-cert\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.147587 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.147161 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-trusted-ca-bundle\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:41:44.147587 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:41:44.147550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-service-ca\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:44.149242 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.149218 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-serving-cert\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:44.149699 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.149653 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-oauth-config\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:44.152258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.152236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-config\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:44.154653 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.154614 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9pf\" (UniqueName: \"kubernetes.io/projected/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-kube-api-access-sb9pf\") pod \"console-5c9597f7b8-nwszm\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:44.218464 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.217988 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:44.470870 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:44.470753 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vk48w" event={"ID":"56593aaa-f779-4c98-94da-5b75ed6e9124","Type":"ContainerStarted","Data":"918a4cde287af3b0a05c62077a5984a51a564a1d6ca535bfa0e378a85fbe3c5b"}
Apr 21 04:41:45.021220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.021170 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c9597f7b8-nwszm"]
Apr 21 04:41:45.182231 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:45.182123 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd443b983_7a46_46fe_aa3d_dfd9c54cdf54.slice/crio-8d35ad4ef94786ee3906af256f7b31173cf2422faf4215850bb28c2502ab3d98 WatchSource:0}: Error finding container 8d35ad4ef94786ee3906af256f7b31173cf2422faf4215850bb28c2502ab3d98: Status 404 returned error can't find the container with id 8d35ad4ef94786ee3906af256f7b31173cf2422faf4215850bb28c2502ab3d98
Apr 21 04:41:45.377387 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.377330 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:45.377510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.377396 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:41:45.475833 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.475737 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f74db4bf-g578v" event={"ID":"01810937-538e-48fa-933b-413bee53f6d8","Type":"ContainerStarted","Data":"003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd"}
Apr 21 04:41:45.478074 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.478039 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vk48w" event={"ID":"56593aaa-f779-4c98-94da-5b75ed6e9124","Type":"ContainerStarted","Data":"af841e094e4a6a724f55a1d3dcfa7ae5e91423d5301e27bb3c653f12eb55e6ff"}
Apr 21 04:41:45.478225 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.478144 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vk48w"
Apr 21 04:41:45.479610 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.479583 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9597f7b8-nwszm" event={"ID":"d443b983-7a46-46fe-aa3d-dfd9c54cdf54","Type":"ContainerStarted","Data":"0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147"}
Apr 21 04:41:45.479741 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.479614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9597f7b8-nwszm" event={"ID":"d443b983-7a46-46fe-aa3d-dfd9c54cdf54","Type":"ContainerStarted","Data":"8d35ad4ef94786ee3906af256f7b31173cf2422faf4215850bb28c2502ab3d98"}
Apr 21 04:41:45.497220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.497170 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59f74db4bf-g578v" podStartSLOduration=6.558610093 podStartE2EDuration="10.49715323s" podCreationTimestamp="2026-04-21 04:41:35 +0000 UTC" firstStartedPulling="2026-04-21 04:41:41.276216889 +0000 UTC m=+163.902893544" lastFinishedPulling="2026-04-21 04:41:45.214760014 +0000 UTC m=+167.841436681" observedRunningTime="2026-04-21 04:41:45.49543385 +0000 UTC m=+168.122110529" watchObservedRunningTime="2026-04-21 04:41:45.49715323 +0000 UTC m=+168.123829908"
Apr 21 04:41:45.514653 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.514589 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vk48w" podStartSLOduration=133.485762646 podStartE2EDuration="2m15.514569305s" podCreationTimestamp="2026-04-21 04:39:30 +0000 UTC" firstStartedPulling="2026-04-21 04:41:41.300605849 +0000 UTC m=+163.927282510" lastFinishedPulling="2026-04-21 04:41:43.3294125 +0000 UTC m=+165.956089169" observedRunningTime="2026-04-21 04:41:45.513959119 +0000 UTC m=+168.140635798" watchObservedRunningTime="2026-04-21 04:41:45.514569305 +0000 UTC m=+168.141245983"
Apr 21 04:41:45.534481 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:45.534422 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c9597f7b8-nwszm" podStartSLOduration=2.534402338 podStartE2EDuration="2.534402338s" podCreationTimestamp="2026-04-21 04:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:41:45.533417142 +0000 UTC m=+168.160093832" watchObservedRunningTime="2026-04-21 04:41:45.534402338 +0000 UTC m=+168.161079016"
Apr 21 04:41:46.892214 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:46.892175 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:41:46.895136 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:46.895106 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ljx4n\""
Apr 21 04:41:46.903460 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:46.903423 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j8jtk"
Apr 21 04:41:47.043413 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:47.043316 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-j8jtk"]
Apr 21 04:41:47.045988 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:41:47.045950 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e3ff29_fd8e_4e1d_a4ed_a25ba7dbd7d2.slice/crio-3fab3af90aee149e537d3650a6d5cf5fb816da0077d8b6a5ad78a110c8594453 WatchSource:0}: Error finding container 3fab3af90aee149e537d3650a6d5cf5fb816da0077d8b6a5ad78a110c8594453: Status 404 returned error can't find the container with id 3fab3af90aee149e537d3650a6d5cf5fb816da0077d8b6a5ad78a110c8594453
Apr 21 04:41:47.487178 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:47.487133 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-j8jtk" event={"ID":"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2","Type":"ContainerStarted","Data":"3fab3af90aee149e537d3650a6d5cf5fb816da0077d8b6a5ad78a110c8594453"}
Apr 21 04:41:49.496474 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:49.496379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-j8jtk" event={"ID":"f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2","Type":"ContainerStarted","Data":"00b81f79c5765ae2e34ecb0ee1e27a57e236eeb918504e16b5e88235b9251f25"}
Apr 21 04:41:49.513242 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:49.513190 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-j8jtk" podStartSLOduration=137.349777386 podStartE2EDuration="2m19.513173009s" podCreationTimestamp="2026-04-21 04:39:30 +0000 UTC" firstStartedPulling="2026-04-21 04:41:47.048510287 +0000 UTC m=+169.675186967" lastFinishedPulling="2026-04-21 04:41:49.211905928 +0000 UTC m=+171.838582590" observedRunningTime="2026-04-21 04:41:49.510898113 +0000 UTC m=+172.137574831" watchObservedRunningTime="2026-04-21 04:41:49.513173009 +0000 UTC m=+172.139849688"
Apr 21 04:41:54.218741 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:54.218704 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:54.219296 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:54.218757 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:54.223515 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:54.223490 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:54.517430 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:54.517332 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:41:54.562188 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:54.562157 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59f74db4bf-g578v"]
Apr 21 04:41:55.462258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:55.462217 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59f74db4bf-g578v"
Apr 21 04:41:55.485388 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:41:55.485333 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vk48w"
Apr 21 04:42:05.383743 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:05.383710 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:42:05.387674 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:05.387646 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-599ff764cd-p45c5"
Apr 21 04:42:18.582994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:18.582956 2579 generic.go:358] "Generic (PLEG): container finished" podID="c006d753-e048-4aef-a851-7a8ec3111def" containerID="6691ac454d40d93a3ac0e68ea4f2a86938470547f514182873bbbc04c0e790cc" exitCode=0
Apr 21 04:42:18.583389 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:18.583009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" event={"ID":"c006d753-e048-4aef-a851-7a8ec3111def","Type":"ContainerDied","Data":"6691ac454d40d93a3ac0e68ea4f2a86938470547f514182873bbbc04c0e790cc"}
Apr 21 04:42:18.583439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:18.583403 2579 scope.go:117] "RemoveContainer" containerID="6691ac454d40d93a3ac0e68ea4f2a86938470547f514182873bbbc04c0e790cc"
Apr 21 04:42:19.580981 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.580932 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59f74db4bf-g578v" podUID="01810937-538e-48fa-933b-413bee53f6d8" containerName="console" containerID="cri-o://003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd" gracePeriod=15
Apr 21 04:42:19.587973 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.587944 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-phgkm" event={"ID":"c006d753-e048-4aef-a851-7a8ec3111def","Type":"ContainerStarted","Data":"c0d5bb2194d94783c6850ba1cc0a675f47d195af050fea935fb677a8694117c5"}
Apr 21 04:42:19.846150 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.846129 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59f74db4bf-g578v_01810937-538e-48fa-933b-413bee53f6d8/console/0.log"
Apr 21 04:42:19.846264 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.846192 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59f74db4bf-g578v"
Apr 21 04:42:19.953017 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.952984 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-service-ca\") pod \"01810937-538e-48fa-933b-413bee53f6d8\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") "
Apr 21 04:42:19.953183 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953031 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-oauth-serving-cert\") pod \"01810937-538e-48fa-933b-413bee53f6d8\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") "
Apr 21 04:42:19.953183 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953080 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-console-config\") pod \"01810937-538e-48fa-933b-413bee53f6d8\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") "
Apr 21 04:42:19.953183 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953104 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-oauth-config\") pod \"01810937-538e-48fa-933b-413bee53f6d8\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") "
Apr 21 04:42:19.953183 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953128 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tf6c\" (UniqueName: \"kubernetes.io/projected/01810937-538e-48fa-933b-413bee53f6d8-kube-api-access-6tf6c\") pod \"01810937-538e-48fa-933b-413bee53f6d8\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") "
Apr 21 04:42:19.953387 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953291 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-serving-cert\") pod \"01810937-538e-48fa-933b-413bee53f6d8\" (UID: \"01810937-538e-48fa-933b-413bee53f6d8\") "
Apr 21 04:42:19.953518 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953492 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "01810937-538e-48fa-933b-413bee53f6d8" (UID: "01810937-538e-48fa-933b-413bee53f6d8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:42:19.953518 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953500 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-service-ca" (OuterVolumeSpecName: "service-ca") pod "01810937-538e-48fa-933b-413bee53f6d8" (UID: "01810937-538e-48fa-933b-413bee53f6d8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:42:19.953624 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953504 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-console-config" (OuterVolumeSpecName: "console-config") pod "01810937-538e-48fa-933b-413bee53f6d8" (UID: "01810937-538e-48fa-933b-413bee53f6d8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:42:19.953624 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953613 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-service-ca\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:42:19.953727 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953638 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-oauth-serving-cert\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:42:19.953727 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.953655 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01810937-538e-48fa-933b-413bee53f6d8-console-config\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:42:19.955588 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.955559 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "01810937-538e-48fa-933b-413bee53f6d8" (UID: "01810937-538e-48fa-933b-413bee53f6d8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:42:19.955588 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.955578 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "01810937-538e-48fa-933b-413bee53f6d8" (UID: "01810937-538e-48fa-933b-413bee53f6d8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:42:19.955704 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:19.955598 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01810937-538e-48fa-933b-413bee53f6d8-kube-api-access-6tf6c" (OuterVolumeSpecName: "kube-api-access-6tf6c") pod "01810937-538e-48fa-933b-413bee53f6d8" (UID: "01810937-538e-48fa-933b-413bee53f6d8"). InnerVolumeSpecName "kube-api-access-6tf6c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:42:20.054943 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.054914 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-oauth-config\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:42:20.054943 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.054938 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6tf6c\" (UniqueName: \"kubernetes.io/projected/01810937-538e-48fa-933b-413bee53f6d8-kube-api-access-6tf6c\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:42:20.054943 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.054950 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01810937-538e-48fa-933b-413bee53f6d8-console-serving-cert\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:42:20.592014 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.591978 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59f74db4bf-g578v_01810937-538e-48fa-933b-413bee53f6d8/console/0.log"
Apr 21 04:42:20.592527 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.592032 2579 generic.go:358] "Generic (PLEG): container finished" podID="01810937-538e-48fa-933b-413bee53f6d8" containerID="003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd" exitCode=2
Apr 21 04:42:20.592527 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.592122 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f74db4bf-g578v" event={"ID":"01810937-538e-48fa-933b-413bee53f6d8","Type":"ContainerDied","Data":"003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd"}
Apr 21 04:42:20.592527 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.592149 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59f74db4bf-g578v"
Apr 21 04:42:20.592527 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.592173 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f74db4bf-g578v" event={"ID":"01810937-538e-48fa-933b-413bee53f6d8","Type":"ContainerDied","Data":"1df1f0bfe76cff07ca8ffe3446698ea411edecab3590f0fb3b84a4267531abf3"}
Apr 21 04:42:20.592527 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.592190 2579 scope.go:117] "RemoveContainer" containerID="003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd"
Apr 21 04:42:20.601419 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.601398 2579 scope.go:117] "RemoveContainer" containerID="003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd"
Apr 21 04:42:20.601737 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:42:20.601705 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd\": container with ID starting with 003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd not found: ID does not exist" containerID="003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd"
Apr 21 04:42:20.601827 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.601751 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd"} err="failed to get container status \"003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd\": rpc error: code = NotFound desc = could not find container \"003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd\": container with ID starting with 003706aa5ef624cc4536ee5f64ed200e1fa14ea95de7bc05b2bf605f415548dd not found: ID does not exist"
Apr 21 04:42:20.613830 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.613798 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59f74db4bf-g578v"]
Apr 21 04:42:20.620183 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:20.620148 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59f74db4bf-g578v"]
Apr 21 04:42:21.896673 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:21.896636 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01810937-538e-48fa-933b-413bee53f6d8" path="/var/lib/kubelet/pods/01810937-538e-48fa-933b-413bee53f6d8/volumes"
Apr 21 04:42:22.599842 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:22.599809 2579 generic.go:358] "Generic (PLEG): container finished" podID="30ec7bfa-4b0d-470f-912f-87600811562b" containerID="43c5ba5665d89e73cd69eb89a746e145b5e97699e2fd55a15e97298991e24ba1" exitCode=0
Apr 21 04:42:22.600008 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:22.599859 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" event={"ID":"30ec7bfa-4b0d-470f-912f-87600811562b","Type":"ContainerDied","Data":"43c5ba5665d89e73cd69eb89a746e145b5e97699e2fd55a15e97298991e24ba1"}
Apr 21 04:42:22.600221 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:22.600206 2579 scope.go:117] "RemoveContainer" containerID="43c5ba5665d89e73cd69eb89a746e145b5e97699e2fd55a15e97298991e24ba1"
Apr 21 04:42:23.604571 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:23.604530 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s5bdm" event={"ID":"30ec7bfa-4b0d-470f-912f-87600811562b","Type":"ContainerStarted","Data":"cc186c1a52d46228ac981519f4be3ef6772930432fa3b49ceee9cb180fb9b5e3"}
Apr 21 04:42:39.549711 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.549678 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8678864d96-bhm72"]
Apr 21 04:42:39.550062 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.550006 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01810937-538e-48fa-933b-413bee53f6d8" containerName="console"
Apr 21 04:42:39.550062 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.550018 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="01810937-538e-48fa-933b-413bee53f6d8" containerName="console"
Apr 21 04:42:39.550130 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.550069 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="01810937-538e-48fa-933b-413bee53f6d8" containerName="console"
Apr 21 04:42:39.555404 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.555378 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.565528 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.565500 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8678864d96-bhm72"]
Apr 21 04:42:39.732928 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.732889 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-oauth-config\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.732928 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.732929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-trusted-ca-bundle\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.733141 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.732964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-serving-cert\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.733141 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.732985 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzqz\" (UniqueName: \"kubernetes.io/projected/2e7fbddb-2697-437d-82e4-8195343dbb73-kube-api-access-bbzqz\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.733141 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.733004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-oauth-serving-cert\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.733141 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.733036 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-service-ca\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.733141 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.733087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-console-config\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.833834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.833749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-oauth-config\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.833834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.833790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-trusted-ca-bundle\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.833834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.833809 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-serving-cert\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.833834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.833826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzqz\" (UniqueName: \"kubernetes.io/projected/2e7fbddb-2697-437d-82e4-8195343dbb73-kube-api-access-bbzqz\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.834110 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.833853 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-oauth-serving-cert\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.834110 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.833898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-service-ca\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.834110 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.833939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-console-config\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.834718 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.834690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-service-ca\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.834718 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.834706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-oauth-serving-cert\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.834850 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.834801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-console-config\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.834850 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.834829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-trusted-ca-bundle\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.836446 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.836420 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-oauth-config\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.836523 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.836449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-serving-cert\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.841307 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.841288 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzqz\" (UniqueName: \"kubernetes.io/projected/2e7fbddb-2697-437d-82e4-8195343dbb73-kube-api-access-bbzqz\") pod \"console-8678864d96-bhm72\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") " pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.865073 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.865042 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:39.995419 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:39.995387 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8678864d96-bhm72"]
Apr 21 04:42:39.997794 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:42:39.997763 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e7fbddb_2697_437d_82e4_8195343dbb73.slice/crio-6626196f459eccfcca63ce4d50184e1ebfc8cc7aa116027a4c6ca0557a8ccc59 WatchSource:0}: Error finding container 6626196f459eccfcca63ce4d50184e1ebfc8cc7aa116027a4c6ca0557a8ccc59: Status 404 returned error can't find the container with id 6626196f459eccfcca63ce4d50184e1ebfc8cc7aa116027a4c6ca0557a8ccc59
Apr 21 04:42:40.654212 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:40.654172 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8678864d96-bhm72" event={"ID":"2e7fbddb-2697-437d-82e4-8195343dbb73","Type":"ContainerStarted","Data":"ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1"}
Apr 21 04:42:40.654212 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:40.654216 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8678864d96-bhm72" event={"ID":"2e7fbddb-2697-437d-82e4-8195343dbb73","Type":"ContainerStarted","Data":"6626196f459eccfcca63ce4d50184e1ebfc8cc7aa116027a4c6ca0557a8ccc59"}
Apr 21 04:42:40.675018 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:40.674965 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8678864d96-bhm72" podStartSLOduration=1.674947998 podStartE2EDuration="1.674947998s" podCreationTimestamp="2026-04-21 04:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:42:40.673802764 +0000 UTC m=+223.300479443" watchObservedRunningTime="2026-04-21 04:42:40.674947998 +0000 UTC m=+223.301624676"
Apr 21 04:42:49.865199 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:49.865157 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:49.865654 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:49.865212 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:49.870021 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:49.869997 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:50.687627 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:50.687600 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:42:50.741794 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:42:50.741760 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c9597f7b8-nwszm"]
Apr 21 04:43:15.760944 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:15.760902 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c9597f7b8-nwszm" podUID="d443b983-7a46-46fe-aa3d-dfd9c54cdf54" containerName="console" containerID="cri-o://0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147" gracePeriod=15
Apr 21 04:43:16.003882 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.003859 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c9597f7b8-nwszm_d443b983-7a46-46fe-aa3d-dfd9c54cdf54/console/0.log"
Apr 21 04:43:16.004015 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.003922 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c9597f7b8-nwszm"
Apr 21 04:43:16.140047 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140009 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-oauth-config\") pod \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") "
Apr 21 04:43:16.140230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140079 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-config\") pod \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") "
Apr 21 04:43:16.140230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140117 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-serving-cert\") pod \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") "
Apr 21 04:43:16.140230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140156 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-service-ca\") pod \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") "
Apr 21 04:43:16.140230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140190 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb9pf\" (UniqueName: \"kubernetes.io/projected/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-kube-api-access-sb9pf\") pod \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " Apr 21 04:43:16.140230
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140213 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-oauth-serving-cert\") pod \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " Apr 21 04:43:16.140504 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140254 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-trusted-ca-bundle\") pod \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\" (UID: \"d443b983-7a46-46fe-aa3d-dfd9c54cdf54\") " Apr 21 04:43:16.140686 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140650 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-service-ca" (OuterVolumeSpecName: "service-ca") pod "d443b983-7a46-46fe-aa3d-dfd9c54cdf54" (UID: "d443b983-7a46-46fe-aa3d-dfd9c54cdf54"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:43:16.140813 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140695 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-config" (OuterVolumeSpecName: "console-config") pod "d443b983-7a46-46fe-aa3d-dfd9c54cdf54" (UID: "d443b983-7a46-46fe-aa3d-dfd9c54cdf54"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:43:16.140813 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140719 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d443b983-7a46-46fe-aa3d-dfd9c54cdf54" (UID: "d443b983-7a46-46fe-aa3d-dfd9c54cdf54"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:43:16.140813 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.140788 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d443b983-7a46-46fe-aa3d-dfd9c54cdf54" (UID: "d443b983-7a46-46fe-aa3d-dfd9c54cdf54"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:43:16.142270 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.142241 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d443b983-7a46-46fe-aa3d-dfd9c54cdf54" (UID: "d443b983-7a46-46fe-aa3d-dfd9c54cdf54"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:43:16.142270 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.142260 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d443b983-7a46-46fe-aa3d-dfd9c54cdf54" (UID: "d443b983-7a46-46fe-aa3d-dfd9c54cdf54"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:43:16.142270 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.142262 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-kube-api-access-sb9pf" (OuterVolumeSpecName: "kube-api-access-sb9pf") pod "d443b983-7a46-46fe-aa3d-dfd9c54cdf54" (UID: "d443b983-7a46-46fe-aa3d-dfd9c54cdf54"). InnerVolumeSpecName "kube-api-access-sb9pf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:43:16.241225 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.241186 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-oauth-config\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:43:16.241225 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.241216 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-config\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:43:16.241225 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.241229 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-console-serving-cert\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:43:16.241495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.241243 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-service-ca\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:43:16.241495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.241255 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sb9pf\" (UniqueName: 
\"kubernetes.io/projected/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-kube-api-access-sb9pf\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:43:16.241495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.241266 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-oauth-serving-cert\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:43:16.241495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.241277 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d443b983-7a46-46fe-aa3d-dfd9c54cdf54-trusted-ca-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:43:16.756946 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.756917 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c9597f7b8-nwszm_d443b983-7a46-46fe-aa3d-dfd9c54cdf54/console/0.log" Apr 21 04:43:16.757122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.756959 2579 generic.go:358] "Generic (PLEG): container finished" podID="d443b983-7a46-46fe-aa3d-dfd9c54cdf54" containerID="0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147" exitCode=2 Apr 21 04:43:16.757122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.757027 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c9597f7b8-nwszm" Apr 21 04:43:16.757122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.757050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9597f7b8-nwszm" event={"ID":"d443b983-7a46-46fe-aa3d-dfd9c54cdf54","Type":"ContainerDied","Data":"0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147"} Apr 21 04:43:16.757122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.757089 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c9597f7b8-nwszm" event={"ID":"d443b983-7a46-46fe-aa3d-dfd9c54cdf54","Type":"ContainerDied","Data":"8d35ad4ef94786ee3906af256f7b31173cf2422faf4215850bb28c2502ab3d98"} Apr 21 04:43:16.757122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.757105 2579 scope.go:117] "RemoveContainer" containerID="0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147" Apr 21 04:43:16.765560 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.765404 2579 scope.go:117] "RemoveContainer" containerID="0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147" Apr 21 04:43:16.765786 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:43:16.765663 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147\": container with ID starting with 0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147 not found: ID does not exist" containerID="0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147" Apr 21 04:43:16.765786 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.765688 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147"} err="failed to get container status \"0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147\": rpc error: code = 
NotFound desc = could not find container \"0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147\": container with ID starting with 0f97b01be67761aebf8e772506dfd93b71ab516441afe7fd9e51bfdef1bc1147 not found: ID does not exist" Apr 21 04:43:16.780240 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.780217 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c9597f7b8-nwszm"] Apr 21 04:43:16.784346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:16.784325 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c9597f7b8-nwszm"] Apr 21 04:43:17.896266 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:17.896188 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d443b983-7a46-46fe-aa3d-dfd9c54cdf54" path="/var/lib/kubelet/pods/d443b983-7a46-46fe-aa3d-dfd9c54cdf54/volumes" Apr 21 04:43:51.885377 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.885334 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh"] Apr 21 04:43:51.885893 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.885795 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d443b983-7a46-46fe-aa3d-dfd9c54cdf54" containerName="console" Apr 21 04:43:51.885893 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.885813 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d443b983-7a46-46fe-aa3d-dfd9c54cdf54" containerName="console" Apr 21 04:43:51.885893 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.885881 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d443b983-7a46-46fe-aa3d-dfd9c54cdf54" containerName="console" Apr 21 04:43:51.889081 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.889062 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:51.892258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.892231 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:43:51.892258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.892245 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:43:51.893551 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.893531 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sjrjx\"" Apr 21 04:43:51.898730 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.898702 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh"] Apr 21 04:43:51.934475 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.934440 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:51.934683 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.934483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:51.934683 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:51.934629 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dkts\" (UniqueName: \"kubernetes.io/projected/bafe9f83-2349-41d7-9e7c-1f3924ca6704-kube-api-access-7dkts\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.035696 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.035639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dkts\" (UniqueName: \"kubernetes.io/projected/bafe9f83-2349-41d7-9e7c-1f3924ca6704-kube-api-access-7dkts\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.035893 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.035723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.035893 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.035744 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.036072 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:43:52.036052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.036136 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.036078 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.046728 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.046706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dkts\" (UniqueName: \"kubernetes.io/projected/bafe9f83-2349-41d7-9e7c-1f3924ca6704-kube-api-access-7dkts\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.199357 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.199271 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:43:52.330642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.330610 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh"] Apr 21 04:43:52.333105 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:43:52.333070 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbafe9f83_2349_41d7_9e7c_1f3924ca6704.slice/crio-bca5a4594bf5f33c2654103a28a7ada98f28f03fe883a47ffd708f00ef9fab4e WatchSource:0}: Error finding container bca5a4594bf5f33c2654103a28a7ada98f28f03fe883a47ffd708f00ef9fab4e: Status 404 returned error can't find the container with id bca5a4594bf5f33c2654103a28a7ada98f28f03fe883a47ffd708f00ef9fab4e Apr 21 04:43:52.862344 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:52.862296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" event={"ID":"bafe9f83-2349-41d7-9e7c-1f3924ca6704","Type":"ContainerStarted","Data":"bca5a4594bf5f33c2654103a28a7ada98f28f03fe883a47ffd708f00ef9fab4e"} Apr 21 04:43:57.772721 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:57.772637 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log" Apr 21 04:43:57.775463 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:57.775437 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log" Apr 21 04:43:57.776812 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:57.776796 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 04:43:57.878399 ip-10-0-135-122 kubenswrapper[2579]: I0421 
04:43:57.878207 2579 generic.go:358] "Generic (PLEG): container finished" podID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerID="b57c70650d59700183ffe57c0ab6399ab6729f9ff849b39ea26b602723fdc01a" exitCode=0 Apr 21 04:43:57.878399 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:43:57.878289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" event={"ID":"bafe9f83-2349-41d7-9e7c-1f3924ca6704","Type":"ContainerDied","Data":"b57c70650d59700183ffe57c0ab6399ab6729f9ff849b39ea26b602723fdc01a"} Apr 21 04:44:01.893856 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:01.893820 2579 generic.go:358] "Generic (PLEG): container finished" podID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerID="0fc7bae03d2e8edf684b4a289427e8115c94582f2850fd3cc6841617e85fb632" exitCode=0 Apr 21 04:44:01.894923 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:01.894897 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:44:01.896253 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:01.896230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" event={"ID":"bafe9f83-2349-41d7-9e7c-1f3924ca6704","Type":"ContainerDied","Data":"0fc7bae03d2e8edf684b4a289427e8115c94582f2850fd3cc6841617e85fb632"} Apr 21 04:44:08.915509 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:08.915471 2579 generic.go:358] "Generic (PLEG): container finished" podID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerID="48055972876ccf5d4db7f59b96b3307227513fbc578289358bfca0867f6bba0c" exitCode=0 Apr 21 04:44:08.915878 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:08.915530 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" 
event={"ID":"bafe9f83-2349-41d7-9e7c-1f3924ca6704","Type":"ContainerDied","Data":"48055972876ccf5d4db7f59b96b3307227513fbc578289358bfca0867f6bba0c"} Apr 21 04:44:10.036691 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.036666 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:44:10.089516 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.089473 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dkts\" (UniqueName: \"kubernetes.io/projected/bafe9f83-2349-41d7-9e7c-1f3924ca6704-kube-api-access-7dkts\") pod \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " Apr 21 04:44:10.089516 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.089516 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-bundle\") pod \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " Apr 21 04:44:10.089766 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.089538 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-util\") pod \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\" (UID: \"bafe9f83-2349-41d7-9e7c-1f3924ca6704\") " Apr 21 04:44:10.090227 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.090199 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-bundle" (OuterVolumeSpecName: "bundle") pod "bafe9f83-2349-41d7-9e7c-1f3924ca6704" (UID: "bafe9f83-2349-41d7-9e7c-1f3924ca6704"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:44:10.091711 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.091683 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafe9f83-2349-41d7-9e7c-1f3924ca6704-kube-api-access-7dkts" (OuterVolumeSpecName: "kube-api-access-7dkts") pod "bafe9f83-2349-41d7-9e7c-1f3924ca6704" (UID: "bafe9f83-2349-41d7-9e7c-1f3924ca6704"). InnerVolumeSpecName "kube-api-access-7dkts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:44:10.094165 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.094137 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-util" (OuterVolumeSpecName: "util") pod "bafe9f83-2349-41d7-9e7c-1f3924ca6704" (UID: "bafe9f83-2349-41d7-9e7c-1f3924ca6704"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:44:10.190227 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.190141 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7dkts\" (UniqueName: \"kubernetes.io/projected/bafe9f83-2349-41d7-9e7c-1f3924ca6704-kube-api-access-7dkts\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:10.190227 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.190170 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:10.190227 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.190179 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bafe9f83-2349-41d7-9e7c-1f3924ca6704-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:10.923258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.923227 2579 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" Apr 21 04:44:10.923448 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.923223 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2g7dh" event={"ID":"bafe9f83-2349-41d7-9e7c-1f3924ca6704","Type":"ContainerDied","Data":"bca5a4594bf5f33c2654103a28a7ada98f28f03fe883a47ffd708f00ef9fab4e"} Apr 21 04:44:10.923448 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:10.923338 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca5a4594bf5f33c2654103a28a7ada98f28f03fe883a47ffd708f00ef9fab4e" Apr 21 04:44:15.055115 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055077 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk"] Apr 21 04:44:15.055510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055390 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerName="extract" Apr 21 04:44:15.055510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055402 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerName="extract" Apr 21 04:44:15.055510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055411 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerName="util" Apr 21 04:44:15.055510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055417 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerName="util" Apr 21 04:44:15.055510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055434 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerName="pull" Apr 21 04:44:15.055510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055439 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerName="pull" Apr 21 04:44:15.055510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.055481 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="bafe9f83-2349-41d7-9e7c-1f3924ca6704" containerName="extract" Apr 21 04:44:15.112450 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.112418 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk"] Apr 21 04:44:15.112628 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.112575 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.115656 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.115631 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:44:15.116465 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.116444 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 04:44:15.116820 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.116798 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-d974k\"" Apr 21 04:44:15.230443 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.230398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0ff1812-ca97-4fa5-86ea-46813b9dc539-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rs4rk\" (UID: 
\"c0ff1812-ca97-4fa5-86ea-46813b9dc539\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.230621 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.230520 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vz6v\" (UniqueName: \"kubernetes.io/projected/c0ff1812-ca97-4fa5-86ea-46813b9dc539-kube-api-access-9vz6v\") pod \"cert-manager-operator-controller-manager-54b9655956-rs4rk\" (UID: \"c0ff1812-ca97-4fa5-86ea-46813b9dc539\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.331551 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.331463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vz6v\" (UniqueName: \"kubernetes.io/projected/c0ff1812-ca97-4fa5-86ea-46813b9dc539-kube-api-access-9vz6v\") pod \"cert-manager-operator-controller-manager-54b9655956-rs4rk\" (UID: \"c0ff1812-ca97-4fa5-86ea-46813b9dc539\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.331551 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.331544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0ff1812-ca97-4fa5-86ea-46813b9dc539-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rs4rk\" (UID: \"c0ff1812-ca97-4fa5-86ea-46813b9dc539\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.331935 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.331917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0ff1812-ca97-4fa5-86ea-46813b9dc539-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rs4rk\" (UID: \"c0ff1812-ca97-4fa5-86ea-46813b9dc539\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.341905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.341875 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vz6v\" (UniqueName: \"kubernetes.io/projected/c0ff1812-ca97-4fa5-86ea-46813b9dc539-kube-api-access-9vz6v\") pod \"cert-manager-operator-controller-manager-54b9655956-rs4rk\" (UID: \"c0ff1812-ca97-4fa5-86ea-46813b9dc539\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.421986 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.421951 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" Apr 21 04:44:15.552686 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.552658 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk"] Apr 21 04:44:15.555770 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:15.555737 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ff1812_ca97_4fa5_86ea_46813b9dc539.slice/crio-201f3d5391652839da590c15f43d5e29bed594285814af94ec09964d1fd725f7 WatchSource:0}: Error finding container 201f3d5391652839da590c15f43d5e29bed594285814af94ec09964d1fd725f7: Status 404 returned error can't find the container with id 201f3d5391652839da590c15f43d5e29bed594285814af94ec09964d1fd725f7 Apr 21 04:44:15.939531 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:15.939489 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" event={"ID":"c0ff1812-ca97-4fa5-86ea-46813b9dc539","Type":"ContainerStarted","Data":"201f3d5391652839da590c15f43d5e29bed594285814af94ec09964d1fd725f7"} Apr 21 04:44:17.947775 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:17.947730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" event={"ID":"c0ff1812-ca97-4fa5-86ea-46813b9dc539","Type":"ContainerStarted","Data":"97a2343a52286bc32ac0caed2f9364f1ad0ba51a47ba3b68f5613c10d06f4624"} Apr 21 04:44:17.986285 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:17.986219 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rs4rk" podStartSLOduration=1.081587679 podStartE2EDuration="2.986198948s" podCreationTimestamp="2026-04-21 04:44:15 +0000 UTC" firstStartedPulling="2026-04-21 04:44:15.558255907 +0000 UTC m=+318.184932563" lastFinishedPulling="2026-04-21 04:44:17.46286716 +0000 UTC m=+320.089543832" observedRunningTime="2026-04-21 04:44:17.981072341 +0000 UTC m=+320.607749018" watchObservedRunningTime="2026-04-21 04:44:17.986198948 +0000 UTC m=+320.612875630" Apr 21 04:44:19.214124 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.214077 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn"] Apr 21 04:44:19.232563 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.232522 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.233872 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.233841 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn"] Apr 21 04:44:19.242576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.242549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:44:19.242716 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.242615 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:44:19.243726 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.243702 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sjrjx\"" Apr 21 04:44:19.368873 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.368833 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.368873 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.368873 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.369077 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.368913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdpcf\" (UniqueName: \"kubernetes.io/projected/c766accc-c0e9-4f1b-8974-6d2332b01f3b-kube-api-access-jdpcf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.469628 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.469538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.469628 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.469574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.469628 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.469619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdpcf\" (UniqueName: \"kubernetes.io/projected/c766accc-c0e9-4f1b-8974-6d2332b01f3b-kube-api-access-jdpcf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.469998 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:44:19.469973 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.470035 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.469982 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.479937 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.479901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdpcf\" (UniqueName: \"kubernetes.io/projected/c766accc-c0e9-4f1b-8974-6d2332b01f3b-kube-api-access-jdpcf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.543466 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.543425 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:19.677947 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.677799 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn"] Apr 21 04:44:19.680452 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:19.680417 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc766accc_c0e9_4f1b_8974_6d2332b01f3b.slice/crio-87668446613bf694d3fcff36ef3763300bd6e6c257213193e48077f8ad619ad9 WatchSource:0}: Error finding container 87668446613bf694d3fcff36ef3763300bd6e6c257213193e48077f8ad619ad9: Status 404 returned error can't find the container with id 87668446613bf694d3fcff36ef3763300bd6e6c257213193e48077f8ad619ad9 Apr 21 04:44:19.955121 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.955083 2579 generic.go:358] "Generic (PLEG): container finished" podID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerID="54b58f02bb6db857a80ba7957e52063b6004271ded348698946bbf86de44633b" exitCode=0 Apr 21 04:44:19.955305 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.955175 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" event={"ID":"c766accc-c0e9-4f1b-8974-6d2332b01f3b","Type":"ContainerDied","Data":"54b58f02bb6db857a80ba7957e52063b6004271ded348698946bbf86de44633b"} Apr 21 04:44:19.955305 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:19.955216 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" event={"ID":"c766accc-c0e9-4f1b-8974-6d2332b01f3b","Type":"ContainerStarted","Data":"87668446613bf694d3fcff36ef3763300bd6e6c257213193e48077f8ad619ad9"} Apr 21 04:44:22.966096 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:44:22.966058 2579 generic.go:358] "Generic (PLEG): container finished" podID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerID="569e030d3685da13392a8728d70c559c16345350723b5c86289fab5db2877906" exitCode=0 Apr 21 04:44:22.966518 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:22.966133 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" event={"ID":"c766accc-c0e9-4f1b-8974-6d2332b01f3b","Type":"ContainerDied","Data":"569e030d3685da13392a8728d70c559c16345350723b5c86289fab5db2877906"} Apr 21 04:44:23.947389 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.947339 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-bmxjx"] Apr 21 04:44:23.950642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.950613 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:23.953398 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.953379 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 04:44:23.953494 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.953409 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 04:44:23.954565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.954549 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-68m4z\"" Apr 21 04:44:23.963292 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.963265 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-bmxjx"] Apr 21 04:44:23.971940 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.971912 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerID="ababc38937eb167ed352db66040a60967644c4bf304b01e72b6c476a34b96a43" exitCode=0 Apr 21 04:44:23.972287 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:23.971997 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" event={"ID":"c766accc-c0e9-4f1b-8974-6d2332b01f3b","Type":"ContainerDied","Data":"ababc38937eb167ed352db66040a60967644c4bf304b01e72b6c476a34b96a43"} Apr 21 04:44:24.112737 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.112695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbb9279b-9f74-4d45-b642-d04c16db316b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-bmxjx\" (UID: \"fbb9279b-9f74-4d45-b642-d04c16db316b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:24.112926 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.112771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrdr\" (UniqueName: \"kubernetes.io/projected/fbb9279b-9f74-4d45-b642-d04c16db316b-kube-api-access-2mrdr\") pod \"cert-manager-cainjector-68b757865b-bmxjx\" (UID: \"fbb9279b-9f74-4d45-b642-d04c16db316b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:24.214318 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.214232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbb9279b-9f74-4d45-b642-d04c16db316b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-bmxjx\" (UID: \"fbb9279b-9f74-4d45-b642-d04c16db316b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:24.214318 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.214289 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2mrdr\" (UniqueName: \"kubernetes.io/projected/fbb9279b-9f74-4d45-b642-d04c16db316b-kube-api-access-2mrdr\") pod \"cert-manager-cainjector-68b757865b-bmxjx\" (UID: \"fbb9279b-9f74-4d45-b642-d04c16db316b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:24.227122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.227093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbb9279b-9f74-4d45-b642-d04c16db316b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-bmxjx\" (UID: \"fbb9279b-9f74-4d45-b642-d04c16db316b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:24.231465 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.231436 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrdr\" (UniqueName: \"kubernetes.io/projected/fbb9279b-9f74-4d45-b642-d04c16db316b-kube-api-access-2mrdr\") pod \"cert-manager-cainjector-68b757865b-bmxjx\" (UID: \"fbb9279b-9f74-4d45-b642-d04c16db316b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:24.272702 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.272665 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" Apr 21 04:44:24.404239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.404196 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-bmxjx"] Apr 21 04:44:24.411216 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:24.411185 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbb9279b_9f74_4d45_b642_d04c16db316b.slice/crio-d94575798698129d20e9498accf9c7456db62330a5d618fb26ca0bcd8ade3487 WatchSource:0}: Error finding container d94575798698129d20e9498accf9c7456db62330a5d618fb26ca0bcd8ade3487: Status 404 returned error can't find the container with id d94575798698129d20e9498accf9c7456db62330a5d618fb26ca0bcd8ade3487 Apr 21 04:44:24.978670 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:24.978627 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" event={"ID":"fbb9279b-9f74-4d45-b642-d04c16db316b","Type":"ContainerStarted","Data":"d94575798698129d20e9498accf9c7456db62330a5d618fb26ca0bcd8ade3487"} Apr 21 04:44:25.091116 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.091094 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:25.224231 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.224180 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdpcf\" (UniqueName: \"kubernetes.io/projected/c766accc-c0e9-4f1b-8974-6d2332b01f3b-kube-api-access-jdpcf\") pod \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " Apr 21 04:44:25.224419 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.224291 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-bundle\") pod \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " Apr 21 04:44:25.224419 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.224322 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-util\") pod \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\" (UID: \"c766accc-c0e9-4f1b-8974-6d2332b01f3b\") " Apr 21 04:44:25.224758 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.224729 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-bundle" (OuterVolumeSpecName: "bundle") pod "c766accc-c0e9-4f1b-8974-6d2332b01f3b" (UID: "c766accc-c0e9-4f1b-8974-6d2332b01f3b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:44:25.226762 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.226730 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c766accc-c0e9-4f1b-8974-6d2332b01f3b-kube-api-access-jdpcf" (OuterVolumeSpecName: "kube-api-access-jdpcf") pod "c766accc-c0e9-4f1b-8974-6d2332b01f3b" (UID: "c766accc-c0e9-4f1b-8974-6d2332b01f3b"). InnerVolumeSpecName "kube-api-access-jdpcf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:44:25.287814 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.287773 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-util" (OuterVolumeSpecName: "util") pod "c766accc-c0e9-4f1b-8974-6d2332b01f3b" (UID: "c766accc-c0e9-4f1b-8974-6d2332b01f3b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:44:25.325499 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.325465 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdpcf\" (UniqueName: \"kubernetes.io/projected/c766accc-c0e9-4f1b-8974-6d2332b01f3b-kube-api-access-jdpcf\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:25.325499 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.325496 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:25.325499 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.325508 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c766accc-c0e9-4f1b-8974-6d2332b01f3b-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:25.983962 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.983909 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" event={"ID":"c766accc-c0e9-4f1b-8974-6d2332b01f3b","Type":"ContainerDied","Data":"87668446613bf694d3fcff36ef3763300bd6e6c257213193e48077f8ad619ad9"} Apr 21 04:44:25.983962 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.983949 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87668446613bf694d3fcff36ef3763300bd6e6c257213193e48077f8ad619ad9" Apr 21 04:44:25.983962 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:25.983951 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f689jn" Apr 21 04:44:27.993060 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:27.993022 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" event={"ID":"fbb9279b-9f74-4d45-b642-d04c16db316b","Type":"ContainerStarted","Data":"d6e2021e3fec12eca31640182b18391a51ef0496589554091b4c577174beb99f"} Apr 21 04:44:28.013641 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:28.013587 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-bmxjx" podStartSLOduration=2.497709161 podStartE2EDuration="5.013572005s" podCreationTimestamp="2026-04-21 04:44:23 +0000 UTC" firstStartedPulling="2026-04-21 04:44:24.413346047 +0000 UTC m=+327.040022703" lastFinishedPulling="2026-04-21 04:44:26.929208887 +0000 UTC m=+329.555885547" observedRunningTime="2026-04-21 04:44:28.012078087 +0000 UTC m=+330.638754765" watchObservedRunningTime="2026-04-21 04:44:28.013572005 +0000 UTC m=+330.640248683" Apr 21 04:44:31.869140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869100 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6"] Apr 21 04:44:31.869565 
ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869423 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerName="util" Apr 21 04:44:31.869565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869434 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerName="util" Apr 21 04:44:31.869565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869452 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerName="pull" Apr 21 04:44:31.869565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869457 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerName="pull" Apr 21 04:44:31.869565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869469 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerName="extract" Apr 21 04:44:31.869565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869475 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerName="extract" Apr 21 04:44:31.869565 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.869529 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c766accc-c0e9-4f1b-8974-6d2332b01f3b" containerName="extract" Apr 21 04:44:31.872510 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.872493 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:31.875171 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.875153 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 04:44:31.876197 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.876166 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:44:31.876317 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.876168 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-vdqkz\"" Apr 21 04:44:31.882041 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.882020 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6"] Apr 21 04:44:31.984948 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.984903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzll\" (UniqueName: \"kubernetes.io/projected/fd580cbc-c1d8-40b7-8b5e-701fe02ac604-kube-api-access-ltzll\") pod \"openshift-lws-operator-bfc7f696d-28rj6\" (UID: \"fd580cbc-c1d8-40b7-8b5e-701fe02ac604\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:31.985117 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:31.985008 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd580cbc-c1d8-40b7-8b5e-701fe02ac604-tmp\") pod \"openshift-lws-operator-bfc7f696d-28rj6\" (UID: \"fd580cbc-c1d8-40b7-8b5e-701fe02ac604\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:32.085699 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.085666 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ltzll\" (UniqueName: \"kubernetes.io/projected/fd580cbc-c1d8-40b7-8b5e-701fe02ac604-kube-api-access-ltzll\") pod \"openshift-lws-operator-bfc7f696d-28rj6\" (UID: \"fd580cbc-c1d8-40b7-8b5e-701fe02ac604\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:32.085914 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.085751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd580cbc-c1d8-40b7-8b5e-701fe02ac604-tmp\") pod \"openshift-lws-operator-bfc7f696d-28rj6\" (UID: \"fd580cbc-c1d8-40b7-8b5e-701fe02ac604\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:32.086134 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.086114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd580cbc-c1d8-40b7-8b5e-701fe02ac604-tmp\") pod \"openshift-lws-operator-bfc7f696d-28rj6\" (UID: \"fd580cbc-c1d8-40b7-8b5e-701fe02ac604\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:32.094816 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.094788 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzll\" (UniqueName: \"kubernetes.io/projected/fd580cbc-c1d8-40b7-8b5e-701fe02ac604-kube-api-access-ltzll\") pod \"openshift-lws-operator-bfc7f696d-28rj6\" (UID: \"fd580cbc-c1d8-40b7-8b5e-701fe02ac604\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:32.182316 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.182215 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" Apr 21 04:44:32.316345 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.316315 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6"] Apr 21 04:44:32.318937 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:32.318911 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd580cbc_c1d8_40b7_8b5e_701fe02ac604.slice/crio-4416a0aec8868d6c77dc1705a6a89be2cf85dcb1fb5687c41bf641828c7e432e WatchSource:0}: Error finding container 4416a0aec8868d6c77dc1705a6a89be2cf85dcb1fb5687c41bf641828c7e432e: Status 404 returned error can't find the container with id 4416a0aec8868d6c77dc1705a6a89be2cf85dcb1fb5687c41bf641828c7e432e Apr 21 04:44:32.446038 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.445958 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-xmjdj"] Apr 21 04:44:32.450515 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.450498 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-xmjdj" Apr 21 04:44:32.452975 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.452953 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dqc8c\"" Apr 21 04:44:32.460247 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.460224 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-xmjdj"] Apr 21 04:44:32.590969 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.590933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b-bound-sa-token\") pod \"cert-manager-79c8d999ff-xmjdj\" (UID: \"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b\") " pod="cert-manager/cert-manager-79c8d999ff-xmjdj" Apr 21 04:44:32.590969 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.590981 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfzds\" (UniqueName: \"kubernetes.io/projected/7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b-kube-api-access-sfzds\") pod \"cert-manager-79c8d999ff-xmjdj\" (UID: \"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b\") " pod="cert-manager/cert-manager-79c8d999ff-xmjdj" Apr 21 04:44:32.692140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.692103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b-bound-sa-token\") pod \"cert-manager-79c8d999ff-xmjdj\" (UID: \"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b\") " pod="cert-manager/cert-manager-79c8d999ff-xmjdj" Apr 21 04:44:32.692140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.692146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfzds\" (UniqueName: 
\"kubernetes.io/projected/7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b-kube-api-access-sfzds\") pod \"cert-manager-79c8d999ff-xmjdj\" (UID: \"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b\") " pod="cert-manager/cert-manager-79c8d999ff-xmjdj" Apr 21 04:44:32.701081 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.701014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b-bound-sa-token\") pod \"cert-manager-79c8d999ff-xmjdj\" (UID: \"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b\") " pod="cert-manager/cert-manager-79c8d999ff-xmjdj" Apr 21 04:44:32.701081 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.701060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfzds\" (UniqueName: \"kubernetes.io/projected/7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b-kube-api-access-sfzds\") pod \"cert-manager-79c8d999ff-xmjdj\" (UID: \"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b\") " pod="cert-manager/cert-manager-79c8d999ff-xmjdj" Apr 21 04:44:32.762208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.762169 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-xmjdj"
Apr 21 04:44:32.882374 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:32.882330 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-xmjdj"]
Apr 21 04:44:32.889671 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:32.889644 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b6c8b72_c28e_4ea5_a70f_f6ddc2c56f8b.slice/crio-8a72d3402d854c075a66d87c9d25ae2a76bcc20bf02e293ab1ba23f84a933978 WatchSource:0}: Error finding container 8a72d3402d854c075a66d87c9d25ae2a76bcc20bf02e293ab1ba23f84a933978: Status 404 returned error can't find the container with id 8a72d3402d854c075a66d87c9d25ae2a76bcc20bf02e293ab1ba23f84a933978
Apr 21 04:44:33.013636 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:33.013540 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-xmjdj" event={"ID":"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b","Type":"ContainerStarted","Data":"29f5ae20ed263f05bd5f9358557d454a4c26013548ef14c2a0bf1828a7641265"}
Apr 21 04:44:33.013636 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:33.013578 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-xmjdj" event={"ID":"7b6c8b72-c28e-4ea5-a70f-f6ddc2c56f8b","Type":"ContainerStarted","Data":"8a72d3402d854c075a66d87c9d25ae2a76bcc20bf02e293ab1ba23f84a933978"}
Apr 21 04:44:33.014746 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:33.014723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" event={"ID":"fd580cbc-c1d8-40b7-8b5e-701fe02ac604","Type":"ContainerStarted","Data":"4416a0aec8868d6c77dc1705a6a89be2cf85dcb1fb5687c41bf641828c7e432e"}
Apr 21 04:44:33.030930 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:33.030883 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-79c8d999ff-xmjdj" podStartSLOduration=1.030868386 podStartE2EDuration="1.030868386s" podCreationTimestamp="2026-04-21 04:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:44:33.029266868 +0000 UTC m=+335.655943548" watchObservedRunningTime="2026-04-21 04:44:33.030868386 +0000 UTC m=+335.657545064"
Apr 21 04:44:35.023378 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:35.023322 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" event={"ID":"fd580cbc-c1d8-40b7-8b5e-701fe02ac604","Type":"ContainerStarted","Data":"27f1d181158fcbc5b00a4261ef99aec662bee0b3a5ed49e2c3e3c63b8778c493"}
Apr 21 04:44:35.042989 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:35.042935 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-28rj6" podStartSLOduration=1.531546434 podStartE2EDuration="4.04292041s" podCreationTimestamp="2026-04-21 04:44:31 +0000 UTC" firstStartedPulling="2026-04-21 04:44:32.320423153 +0000 UTC m=+334.947099810" lastFinishedPulling="2026-04-21 04:44:34.831797129 +0000 UTC m=+337.458473786" observedRunningTime="2026-04-21 04:44:35.041547233 +0000 UTC m=+337.668223912" watchObservedRunningTime="2026-04-21 04:44:35.04292041 +0000 UTC m=+337.669597088"
Apr 21 04:44:39.579505 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.579468 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"]
Apr 21 04:44:39.584478 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.584445 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.587863 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.587835 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 04:44:39.587998 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.587841 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sjrjx\""
Apr 21 04:44:39.588834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.588814 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 04:44:39.591414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.591350 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"]
Apr 21 04:44:39.641196 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.641154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdp4\" (UniqueName: \"kubernetes.io/projected/27f36f5b-4f18-4642-8076-cb3aa2e36782-kube-api-access-xkdp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.641196 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.641196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.641436 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.641222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.742282 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.742246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.742467 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.742295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.742467 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.742385 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdp4\" (UniqueName: \"kubernetes.io/projected/27f36f5b-4f18-4642-8076-cb3aa2e36782-kube-api-access-xkdp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.742676 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.742651 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.742717 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.742684 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.751327 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.751303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdp4\" (UniqueName: \"kubernetes.io/projected/27f36f5b-4f18-4642-8076-cb3aa2e36782-kube-api-access-xkdp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:39.895248 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:39.895164 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:40.018958 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:40.018876 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"]
Apr 21 04:44:40.022890 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:40.022855 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27f36f5b_4f18_4642_8076_cb3aa2e36782.slice/crio-dcded8061a04086fcd5d9976c1285cd261aee4306f6fe0ca6ec0dbe58b21f659 WatchSource:0}: Error finding container dcded8061a04086fcd5d9976c1285cd261aee4306f6fe0ca6ec0dbe58b21f659: Status 404 returned error can't find the container with id dcded8061a04086fcd5d9976c1285cd261aee4306f6fe0ca6ec0dbe58b21f659
Apr 21 04:44:40.043176 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:40.043148 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t" event={"ID":"27f36f5b-4f18-4642-8076-cb3aa2e36782","Type":"ContainerStarted","Data":"dcded8061a04086fcd5d9976c1285cd261aee4306f6fe0ca6ec0dbe58b21f659"}
Apr 21 04:44:41.048134 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:41.048102 2579 generic.go:358] "Generic (PLEG): container finished" podID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerID="5727415c7a5528992a831780865bb10aa64c60e3571b1b8409f7987b19eda1d0" exitCode=0
Apr 21 04:44:41.048638 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:41.048191 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t" event={"ID":"27f36f5b-4f18-4642-8076-cb3aa2e36782","Type":"ContainerDied","Data":"5727415c7a5528992a831780865bb10aa64c60e3571b1b8409f7987b19eda1d0"}
Apr 21 04:44:42.053387 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:44:42.053328 2579 generic.go:358] "Generic (PLEG): container finished" podID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerID="3a87d070d7fe555cfeb5077da2c6cfc8c801d878e2eb81cd3850b3be2b7406f1" exitCode=0
Apr 21 04:44:42.053781 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:42.053395 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t" event={"ID":"27f36f5b-4f18-4642-8076-cb3aa2e36782","Type":"ContainerDied","Data":"3a87d070d7fe555cfeb5077da2c6cfc8c801d878e2eb81cd3850b3be2b7406f1"}
Apr 21 04:44:43.058874 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:43.058839 2579 generic.go:358] "Generic (PLEG): container finished" podID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerID="dda7dc9fc1b74a014cef3bf81984b1be2ead87bf646d94cf35cea50358320366" exitCode=0
Apr 21 04:44:43.058874 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:43.058879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t" event={"ID":"27f36f5b-4f18-4642-8076-cb3aa2e36782","Type":"ContainerDied","Data":"dda7dc9fc1b74a014cef3bf81984b1be2ead87bf646d94cf35cea50358320366"}
Apr 21 04:44:44.185141 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.185115 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:44.276064 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.276020 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-bundle\") pod \"27f36f5b-4f18-4642-8076-cb3aa2e36782\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") "
Apr 21 04:44:44.276064 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.276066 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkdp4\" (UniqueName: \"kubernetes.io/projected/27f36f5b-4f18-4642-8076-cb3aa2e36782-kube-api-access-xkdp4\") pod \"27f36f5b-4f18-4642-8076-cb3aa2e36782\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") "
Apr 21 04:44:44.276325 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.276097 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-util\") pod \"27f36f5b-4f18-4642-8076-cb3aa2e36782\" (UID: \"27f36f5b-4f18-4642-8076-cb3aa2e36782\") "
Apr 21 04:44:44.276764 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.276736 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-bundle" (OuterVolumeSpecName: "bundle") pod "27f36f5b-4f18-4642-8076-cb3aa2e36782" (UID: "27f36f5b-4f18-4642-8076-cb3aa2e36782"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:44:44.278208 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.278181 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f36f5b-4f18-4642-8076-cb3aa2e36782-kube-api-access-xkdp4" (OuterVolumeSpecName: "kube-api-access-xkdp4") pod "27f36f5b-4f18-4642-8076-cb3aa2e36782" (UID: "27f36f5b-4f18-4642-8076-cb3aa2e36782"). InnerVolumeSpecName "kube-api-access-xkdp4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:44:44.281429 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.281404 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-util" (OuterVolumeSpecName: "util") pod "27f36f5b-4f18-4642-8076-cb3aa2e36782" (UID: "27f36f5b-4f18-4642-8076-cb3aa2e36782"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:44:44.376859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.376772 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:44:44.376859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.376804 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xkdp4\" (UniqueName: \"kubernetes.io/projected/27f36f5b-4f18-4642-8076-cb3aa2e36782-kube-api-access-xkdp4\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:44:44.376859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:44.376816 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27f36f5b-4f18-4642-8076-cb3aa2e36782-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:44:45.066340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:45.066306 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t" event={"ID":"27f36f5b-4f18-4642-8076-cb3aa2e36782","Type":"ContainerDied","Data":"dcded8061a04086fcd5d9976c1285cd261aee4306f6fe0ca6ec0dbe58b21f659"}
Apr 21 04:44:45.066340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:45.066340 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcded8061a04086fcd5d9976c1285cd261aee4306f6fe0ca6ec0dbe58b21f659"
Apr 21 04:44:45.066553 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:45.066344 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c55zs7t"
Apr 21 04:44:48.987777 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.987695 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"]
Apr 21 04:44:48.988153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.988036 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerName="pull"
Apr 21 04:44:48.988153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.988048 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerName="pull"
Apr 21 04:44:48.988153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.988060 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerName="util"
Apr 21 04:44:48.988153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.988065 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerName="util"
Apr 21 04:44:48.988153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.988074 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerName="extract"
Apr 21 04:44:48.988153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.988080 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerName="extract"
Apr 21 04:44:48.988153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.988142 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="27f36f5b-4f18-4642-8076-cb3aa2e36782" containerName="extract"
Apr 21 04:44:48.991237 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.991221 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:48.993927 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.993908 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 04:44:48.994900 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.994882 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sjrjx\""
Apr 21 04:44:48.994975 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.994885 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 04:44:48.999882 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:48.999855 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"]
Apr 21 04:44:49.009847 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.009815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: 
\"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.010046 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.009884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnsj4\" (UniqueName: \"kubernetes.io/projected/d25df00d-0274-44ef-ace6-c8608a0abfc7-kube-api-access-dnsj4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.010046 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.009986 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.110685 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.110652 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.110885 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.110704 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: 
\"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.110885 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.110841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnsj4\" (UniqueName: \"kubernetes.io/projected/d25df00d-0274-44ef-ace6-c8608a0abfc7-kube-api-access-dnsj4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.111066 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.111050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.111111 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.111074 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.121387 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.121330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnsj4\" (UniqueName: \"kubernetes.io/projected/d25df00d-0274-44ef-ace6-c8608a0abfc7-kube-api-access-dnsj4\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.300832 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.300795 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"
Apr 21 04:44:49.432500 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:49.432467 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs"]
Apr 21 04:44:49.433838 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:49.433811 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd25df00d_0274_44ef_ace6_c8608a0abfc7.slice/crio-0744db8005ded2e5ff5264931882dd92818611e6a34333dce79709741f9323f5 WatchSource:0}: Error finding container 0744db8005ded2e5ff5264931882dd92818611e6a34333dce79709741f9323f5: Status 404 returned error can't find the container with id 0744db8005ded2e5ff5264931882dd92818611e6a34333dce79709741f9323f5
Apr 21 04:44:50.084552 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:50.084514 2579 generic.go:358] "Generic (PLEG): container finished" podID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerID="d16a49eb2097e1ce1162bbc2568e9b8947ebab6115e5941e01fefbab842d1ba4" exitCode=0
Apr 21 04:44:50.084921 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:50.084561 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs" event={"ID":"d25df00d-0274-44ef-ace6-c8608a0abfc7","Type":"ContainerDied","Data":"d16a49eb2097e1ce1162bbc2568e9b8947ebab6115e5941e01fefbab842d1ba4"}
Apr 21 04:44:50.084921 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:50.084584 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs" event={"ID":"d25df00d-0274-44ef-ace6-c8608a0abfc7","Type":"ContainerStarted","Data":"0744db8005ded2e5ff5264931882dd92818611e6a34333dce79709741f9323f5"}
Apr 21 04:44:51.092271 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.092234 2579 generic.go:358] "Generic (PLEG): container finished" podID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerID="89e3b8df5f78c79edf8059fc3733169812db757bf8ad446001affdadfbb800cd" exitCode=0
Apr 21 04:44:51.092677 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.092326 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs" event={"ID":"d25df00d-0274-44ef-ace6-c8608a0abfc7","Type":"ContainerDied","Data":"89e3b8df5f78c79edf8059fc3733169812db757bf8ad446001affdadfbb800cd"}
Apr 21 04:44:51.106533 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.106507 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"]
Apr 21 04:44:51.111044 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.111023 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.113497 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.113474 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 04:44:51.113751 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.113732 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 04:44:51.113925 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.113906 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9fpwh\""
Apr 21 04:44:51.114041 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.113989 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 04:44:51.114041 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.113996 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 04:44:51.123198 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.123175 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"]
Apr 21 04:44:51.130294 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.130267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/caf15f45-4d35-43a2-af97-c5203c5e3bc5-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.130441 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.130417 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/caf15f45-4d35-43a2-af97-c5203c5e3bc5-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.130504 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.130482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6w2d\" (UniqueName: \"kubernetes.io/projected/caf15f45-4d35-43a2-af97-c5203c5e3bc5-kube-api-access-k6w2d\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.231833 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.231799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/caf15f45-4d35-43a2-af97-c5203c5e3bc5-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.231997 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.231852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6w2d\" (UniqueName: \"kubernetes.io/projected/caf15f45-4d35-43a2-af97-c5203c5e3bc5-kube-api-access-k6w2d\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.231997 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.231907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/caf15f45-4d35-43a2-af97-c5203c5e3bc5-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.234476 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.234447 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/caf15f45-4d35-43a2-af97-c5203c5e3bc5-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.234571 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.234450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/caf15f45-4d35-43a2-af97-c5203c5e3bc5-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.248945 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.248917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6w2d\" (UniqueName: \"kubernetes.io/projected/caf15f45-4d35-43a2-af97-c5203c5e3bc5-kube-api-access-k6w2d\") pod \"opendatahub-operator-controller-manager-55ddb68486-mlwd6\" (UID: \"caf15f45-4d35-43a2-af97-c5203c5e3bc5\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"
Apr 21 04:44:51.433508 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.433407 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6" Apr 21 04:44:51.561537 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:51.561508 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6"] Apr 21 04:44:51.564176 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:44:51.564148 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaf15f45_4d35_43a2_af97_c5203c5e3bc5.slice/crio-4a4ca9ad4342c4eb3f6e47b168c8347405c356ac0710b5aa5b99cf05004720a4 WatchSource:0}: Error finding container 4a4ca9ad4342c4eb3f6e47b168c8347405c356ac0710b5aa5b99cf05004720a4: Status 404 returned error can't find the container with id 4a4ca9ad4342c4eb3f6e47b168c8347405c356ac0710b5aa5b99cf05004720a4 Apr 21 04:44:52.099939 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:52.099903 2579 generic.go:358] "Generic (PLEG): container finished" podID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerID="15d80d3c2f16459e5c7cd17d34ffc1bc6dbb3096af42e7863e4a0a4f8398a606" exitCode=0 Apr 21 04:44:52.100471 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:52.100000 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs" event={"ID":"d25df00d-0274-44ef-ace6-c8608a0abfc7","Type":"ContainerDied","Data":"15d80d3c2f16459e5c7cd17d34ffc1bc6dbb3096af42e7863e4a0a4f8398a606"} Apr 21 04:44:52.101585 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:52.101554 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6" event={"ID":"caf15f45-4d35-43a2-af97-c5203c5e3bc5","Type":"ContainerStarted","Data":"4a4ca9ad4342c4eb3f6e47b168c8347405c356ac0710b5aa5b99cf05004720a4"} Apr 21 04:44:53.611335 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.611301 2579 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs" Apr 21 04:44:53.651970 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.651925 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-bundle\") pod \"d25df00d-0274-44ef-ace6-c8608a0abfc7\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " Apr 21 04:44:53.652136 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.652022 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnsj4\" (UniqueName: \"kubernetes.io/projected/d25df00d-0274-44ef-ace6-c8608a0abfc7-kube-api-access-dnsj4\") pod \"d25df00d-0274-44ef-ace6-c8608a0abfc7\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " Apr 21 04:44:53.652136 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.652105 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-util\") pod \"d25df00d-0274-44ef-ace6-c8608a0abfc7\" (UID: \"d25df00d-0274-44ef-ace6-c8608a0abfc7\") " Apr 21 04:44:53.652915 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.652883 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-bundle" (OuterVolumeSpecName: "bundle") pod "d25df00d-0274-44ef-ace6-c8608a0abfc7" (UID: "d25df00d-0274-44ef-ace6-c8608a0abfc7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:44:53.654785 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.654748 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d25df00d-0274-44ef-ace6-c8608a0abfc7-kube-api-access-dnsj4" (OuterVolumeSpecName: "kube-api-access-dnsj4") pod "d25df00d-0274-44ef-ace6-c8608a0abfc7" (UID: "d25df00d-0274-44ef-ace6-c8608a0abfc7"). InnerVolumeSpecName "kube-api-access-dnsj4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:44:53.661195 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.661164 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-util" (OuterVolumeSpecName: "util") pod "d25df00d-0274-44ef-ace6-c8608a0abfc7" (UID: "d25df00d-0274-44ef-ace6-c8608a0abfc7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:44:53.753023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.752987 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnsj4\" (UniqueName: \"kubernetes.io/projected/d25df00d-0274-44ef-ace6-c8608a0abfc7-kube-api-access-dnsj4\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:53.753023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.753018 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:53.753023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:53.753028 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d25df00d-0274-44ef-ace6-c8608a0abfc7-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:44:54.112285 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:54.112239 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs" event={"ID":"d25df00d-0274-44ef-ace6-c8608a0abfc7","Type":"ContainerDied","Data":"0744db8005ded2e5ff5264931882dd92818611e6a34333dce79709741f9323f5"} Apr 21 04:44:54.112285 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:54.112284 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9bbwgs" Apr 21 04:44:54.112536 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:54.112292 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0744db8005ded2e5ff5264931882dd92818611e6a34333dce79709741f9323f5" Apr 21 04:44:55.117267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:55.117234 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6" event={"ID":"caf15f45-4d35-43a2-af97-c5203c5e3bc5","Type":"ContainerStarted","Data":"c594cf91b5cac5500a2d4c95fa515602086bdee319205939a6bc1638fbc62a4b"} Apr 21 04:44:55.117756 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:55.117380 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6" Apr 21 04:44:55.139716 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:44:55.139662 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6" podStartSLOduration=1.4499728539999999 podStartE2EDuration="4.139648672s" podCreationTimestamp="2026-04-21 04:44:51 +0000 UTC" firstStartedPulling="2026-04-21 04:44:51.56589341 +0000 UTC m=+354.192570066" lastFinishedPulling="2026-04-21 04:44:54.255569228 +0000 UTC m=+356.882245884" observedRunningTime="2026-04-21 04:44:55.136439312 +0000 UTC m=+357.763115989" watchObservedRunningTime="2026-04-21 
04:44:55.139648672 +0000 UTC m=+357.766325350" Apr 21 04:45:06.122765 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:06.122732 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-mlwd6" Apr 21 04:45:08.618037 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.617999 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv"] Apr 21 04:45:08.618428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.618319 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerName="extract" Apr 21 04:45:08.618428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.618330 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerName="extract" Apr 21 04:45:08.618428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.618341 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerName="pull" Apr 21 04:45:08.618428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.618346 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerName="pull" Apr 21 04:45:08.618428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.618402 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerName="util" Apr 21 04:45:08.618428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.618408 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25df00d-0274-44ef-ace6-c8608a0abfc7" containerName="util" Apr 21 04:45:08.618618 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.618461 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d25df00d-0274-44ef-ace6-c8608a0abfc7" 
containerName="extract" Apr 21 04:45:08.621012 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.620991 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.624507 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.624485 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:45:08.624624 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.624549 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sjrjx\"" Apr 21 04:45:08.625481 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.625464 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:45:08.635338 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.635313 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv"] Apr 21 04:45:08.681704 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.681645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.681892 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.681772 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q5dq\" (UniqueName: \"kubernetes.io/projected/e2aad853-03e4-40bb-9021-807986fc82b9-kube-api-access-5q5dq\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.681892 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.681809 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.782323 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.782265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q5dq\" (UniqueName: \"kubernetes.io/projected/e2aad853-03e4-40bb-9021-807986fc82b9-kube-api-access-5q5dq\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.782323 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.782331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.782598 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.782447 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-util\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.782751 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.782731 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.782821 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.782802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.791839 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.791814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q5dq\" (UniqueName: \"kubernetes.io/projected/e2aad853-03e4-40bb-9021-807986fc82b9-kube-api-access-5q5dq\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:08.930629 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:08.930529 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:09.063109 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.063068 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv"] Apr 21 04:45:09.064341 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:45:09.064312 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2aad853_03e4_40bb_9021_807986fc82b9.slice/crio-70f727d1295d8c38e26b8616f695637f49c33597b42430a3960d91fd7f1c1b4d WatchSource:0}: Error finding container 70f727d1295d8c38e26b8616f695637f49c33597b42430a3960d91fd7f1c1b4d: Status 404 returned error can't find the container with id 70f727d1295d8c38e26b8616f695637f49c33597b42430a3960d91fd7f1c1b4d Apr 21 04:45:09.168742 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.168709 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2aad853-03e4-40bb-9021-807986fc82b9" containerID="ee43c35f6177acf381d0d48b0b739726b668aeabfdb68e9b69a22b8ca45eea92" exitCode=0 Apr 21 04:45:09.168938 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.168786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" event={"ID":"e2aad853-03e4-40bb-9021-807986fc82b9","Type":"ContainerDied","Data":"ee43c35f6177acf381d0d48b0b739726b668aeabfdb68e9b69a22b8ca45eea92"} Apr 21 04:45:09.168938 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.168823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" event={"ID":"e2aad853-03e4-40bb-9021-807986fc82b9","Type":"ContainerStarted","Data":"70f727d1295d8c38e26b8616f695637f49c33597b42430a3960d91fd7f1c1b4d"} Apr 21 04:45:09.799671 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:45:09.799637 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j"] Apr 21 04:45:09.801869 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.801846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.805774 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.805750 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 04:45:09.805924 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.805789 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 04:45:09.805924 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.805840 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-6kdpx\"" Apr 21 04:45:09.805924 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.805790 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 04:45:09.805924 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.805870 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 04:45:09.815237 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.815214 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j"] Apr 21 04:45:09.892528 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.892491 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98930cc8-217b-42db-b1f2-816573cc740a-tmp\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " 
pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.892641 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.892546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98930cc8-217b-42db-b1f2-816573cc740a-tls-certs\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.892708 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.892634 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmsmf\" (UniqueName: \"kubernetes.io/projected/98930cc8-217b-42db-b1f2-816573cc740a-kube-api-access-kmsmf\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.993191 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.993163 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98930cc8-217b-42db-b1f2-816573cc740a-tmp\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.993353 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.993212 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98930cc8-217b-42db-b1f2-816573cc740a-tls-certs\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.993353 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.993243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmsmf\" (UniqueName: 
\"kubernetes.io/projected/98930cc8-217b-42db-b1f2-816573cc740a-kube-api-access-kmsmf\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.995479 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.995455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98930cc8-217b-42db-b1f2-816573cc740a-tmp\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:09.995611 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:09.995592 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/98930cc8-217b-42db-b1f2-816573cc740a-tls-certs\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:10.001582 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:10.001556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmsmf\" (UniqueName: \"kubernetes.io/projected/98930cc8-217b-42db-b1f2-816573cc740a-kube-api-access-kmsmf\") pod \"kube-auth-proxy-788fdfdbbd-6df9j\" (UID: \"98930cc8-217b-42db-b1f2-816573cc740a\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:10.112167 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:10.112131 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" Apr 21 04:45:10.174450 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:10.174358 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2aad853-03e4-40bb-9021-807986fc82b9" containerID="31a3893a0d5e8f87fee9d4718ac2c3d1025076b2df151bf341fd05039f3ea47d" exitCode=0 Apr 21 04:45:10.174450 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:10.174397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" event={"ID":"e2aad853-03e4-40bb-9021-807986fc82b9","Type":"ContainerDied","Data":"31a3893a0d5e8f87fee9d4718ac2c3d1025076b2df151bf341fd05039f3ea47d"} Apr 21 04:45:10.262089 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:10.262065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j"] Apr 21 04:45:10.264612 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:45:10.264568 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98930cc8_217b_42db_b1f2_816573cc740a.slice/crio-8780df2f1384ac62583d3af7362e5b8cd4aae523cec4692cbe22e73aec00c122 WatchSource:0}: Error finding container 8780df2f1384ac62583d3af7362e5b8cd4aae523cec4692cbe22e73aec00c122: Status 404 returned error can't find the container with id 8780df2f1384ac62583d3af7362e5b8cd4aae523cec4692cbe22e73aec00c122 Apr 21 04:45:11.181705 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:11.181668 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2aad853-03e4-40bb-9021-807986fc82b9" containerID="2c4fe1922315610dd5d0aabd076138a669f174eecef782156bf0cf18030cb7a1" exitCode=0 Apr 21 04:45:11.182165 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:11.181809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" 
event={"ID":"e2aad853-03e4-40bb-9021-807986fc82b9","Type":"ContainerDied","Data":"2c4fe1922315610dd5d0aabd076138a669f174eecef782156bf0cf18030cb7a1"} Apr 21 04:45:11.183211 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:11.183168 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" event={"ID":"98930cc8-217b-42db-b1f2-816573cc740a","Type":"ContainerStarted","Data":"8780df2f1384ac62583d3af7362e5b8cd4aae523cec4692cbe22e73aec00c122"} Apr 21 04:45:12.319764 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.319737 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" Apr 21 04:45:12.409786 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.409754 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hgptw"] Apr 21 04:45:12.410120 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.410107 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2aad853-03e4-40bb-9021-807986fc82b9" containerName="extract" Apr 21 04:45:12.410186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.410122 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2aad853-03e4-40bb-9021-807986fc82b9" containerName="extract" Apr 21 04:45:12.410186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.410134 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2aad853-03e4-40bb-9021-807986fc82b9" containerName="pull" Apr 21 04:45:12.410186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.410140 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2aad853-03e4-40bb-9021-807986fc82b9" containerName="pull" Apr 21 04:45:12.410186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.410148 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e2aad853-03e4-40bb-9021-807986fc82b9" containerName="util" Apr 21 04:45:12.410186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.410154 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2aad853-03e4-40bb-9021-807986fc82b9" containerName="util" Apr 21 04:45:12.410373 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.410206 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2aad853-03e4-40bb-9021-807986fc82b9" containerName="extract" Apr 21 04:45:12.412011 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.411994 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" Apr 21 04:45:12.414538 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.414511 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 21 04:45:12.414656 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.414604 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-fcmf8\"" Apr 21 04:45:12.416324 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.416304 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-bundle\") pod \"e2aad853-03e4-40bb-9021-807986fc82b9\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " Apr 21 04:45:12.416462 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.416342 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-util\") pod \"e2aad853-03e4-40bb-9021-807986fc82b9\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " Apr 21 04:45:12.416462 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.416384 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-5q5dq\" (UniqueName: \"kubernetes.io/projected/e2aad853-03e4-40bb-9021-807986fc82b9-kube-api-access-5q5dq\") pod \"e2aad853-03e4-40bb-9021-807986fc82b9\" (UID: \"e2aad853-03e4-40bb-9021-807986fc82b9\") " Apr 21 04:45:12.417471 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.417440 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-bundle" (OuterVolumeSpecName: "bundle") pod "e2aad853-03e4-40bb-9021-807986fc82b9" (UID: "e2aad853-03e4-40bb-9021-807986fc82b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:12.418586 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.418553 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2aad853-03e4-40bb-9021-807986fc82b9-kube-api-access-5q5dq" (OuterVolumeSpecName: "kube-api-access-5q5dq") pod "e2aad853-03e4-40bb-9021-807986fc82b9" (UID: "e2aad853-03e4-40bb-9021-807986fc82b9"). InnerVolumeSpecName "kube-api-access-5q5dq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:45:12.422116 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.422077 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hgptw"] Apr 21 04:45:12.424908 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.424871 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-util" (OuterVolumeSpecName: "util") pod "e2aad853-03e4-40bb-9021-807986fc82b9" (UID: "e2aad853-03e4-40bb-9021-807986fc82b9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:45:12.518003 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.517898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfdwh\" (UniqueName: \"kubernetes.io/projected/ad901c4d-2781-4a4c-b833-73dd358da08d-kube-api-access-jfdwh\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:12.518140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.518023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:12.518140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.518112 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:45:12.518140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.518130 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2aad853-03e4-40bb-9021-807986fc82b9-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:45:12.518140 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.518140 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5q5dq\" (UniqueName: \"kubernetes.io/projected/e2aad853-03e4-40bb-9021-807986fc82b9-kube-api-access-5q5dq\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:45:12.619540 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.619505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:12.619705 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.619608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfdwh\" (UniqueName: \"kubernetes.io/projected/ad901c4d-2781-4a4c-b833-73dd358da08d-kube-api-access-jfdwh\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:12.619705 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:12.619679 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 04:45:12.619803 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:12.619754 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert podName:ad901c4d-2781-4a4c-b833-73dd358da08d nodeName:}" failed. No retries permitted until 2026-04-21 04:45:13.119732047 +0000 UTC m=+375.746408712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert") pod "odh-model-controller-858dbf95b8-hgptw" (UID: "ad901c4d-2781-4a4c-b833-73dd358da08d") : secret "odh-model-controller-webhook-cert" not found
Apr 21 04:45:12.630526 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:12.630491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfdwh\" (UniqueName: \"kubernetes.io/projected/ad901c4d-2781-4a4c-b833-73dd358da08d-kube-api-access-jfdwh\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:13.125745 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:13.125691 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:13.125923 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:13.125852 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 04:45:13.125986 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:13.125932 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert podName:ad901c4d-2781-4a4c-b833-73dd358da08d nodeName:}" failed. No retries permitted until 2026-04-21 04:45:14.12591362 +0000 UTC m=+376.752590278 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert") pod "odh-model-controller-858dbf95b8-hgptw" (UID: "ad901c4d-2781-4a4c-b833-73dd358da08d") : secret "odh-model-controller-webhook-cert" not found
Apr 21 04:45:13.194460 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:13.194411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv" event={"ID":"e2aad853-03e4-40bb-9021-807986fc82b9","Type":"ContainerDied","Data":"70f727d1295d8c38e26b8616f695637f49c33597b42430a3960d91fd7f1c1b4d"}
Apr 21 04:45:13.194460 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:13.194452 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f727d1295d8c38e26b8616f695637f49c33597b42430a3960d91fd7f1c1b4d"
Apr 21 04:45:13.194692 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:13.194481 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355mksv"
Apr 21 04:45:14.134317 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:14.134277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:14.137137 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:14.137103 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad901c4d-2781-4a4c-b833-73dd358da08d-cert\") pod \"odh-model-controller-858dbf95b8-hgptw\" (UID: \"ad901c4d-2781-4a4c-b833-73dd358da08d\") " pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:14.204690 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:14.204647 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" event={"ID":"98930cc8-217b-42db-b1f2-816573cc740a","Type":"ContainerStarted","Data":"533778afc799773aadb7f459ea2e01bb6c3e8b1ba0748d4bc2b8c3864765a4e6"}
Apr 21 04:45:14.222297 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:14.222247 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-6df9j" podStartSLOduration=2.219309756 podStartE2EDuration="5.222233871s" podCreationTimestamp="2026-04-21 04:45:09 +0000 UTC" firstStartedPulling="2026-04-21 04:45:10.266718192 +0000 UTC m=+372.893394855" lastFinishedPulling="2026-04-21 04:45:13.269642314 +0000 UTC m=+375.896318970" observedRunningTime="2026-04-21 04:45:14.220279397 +0000 UTC m=+376.846956074" watchObservedRunningTime="2026-04-21 04:45:14.222233871 +0000 UTC m=+376.848910548"
Apr 21 04:45:14.231018 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:14.230983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw"
Apr 21 04:45:14.572159 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:14.572054 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hgptw"]
Apr 21 04:45:14.574690 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:45:14.574649 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad901c4d_2781_4a4c_b833_73dd358da08d.slice/crio-7979606b88165c049ffd583530a3fbd692c39fbb25eb34f6db76fbb175115235 WatchSource:0}: Error finding container 7979606b88165c049ffd583530a3fbd692c39fbb25eb34f6db76fbb175115235: Status 404 returned error can't find the container with id 7979606b88165c049ffd583530a3fbd692c39fbb25eb34f6db76fbb175115235
Apr 21 04:45:15.209735 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:15.209694 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" event={"ID":"ad901c4d-2781-4a4c-b833-73dd358da08d","Type":"ContainerStarted","Data":"7979606b88165c049ffd583530a3fbd692c39fbb25eb34f6db76fbb175115235"}
Apr 21 04:45:18.040278 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.040239 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"]
Apr 21 04:45:18.043905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.043881 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.050420 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.050343 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sjrjx\""
Apr 21 04:45:18.050646 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.050625 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 04:45:18.051272 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.051250 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 04:45:18.067061 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.067032 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"]
Apr 21 04:45:18.170266 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.170221 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.170447 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.170357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.170447 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.170428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlggp\" (UniqueName: \"kubernetes.io/projected/ed22f68e-6282-4452-b162-14446ff2929b-kube-api-access-qlggp\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.223023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.222986 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad901c4d-2781-4a4c-b833-73dd358da08d" containerID="587d257d52fa783a3b339ade7c8e3a9fa990ac40ffcd570a767ba7c435a1b141" exitCode=1
Apr 21 04:45:18.223197 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.223064 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" event={"ID":"ad901c4d-2781-4a4c-b833-73dd358da08d","Type":"ContainerDied","Data":"587d257d52fa783a3b339ade7c8e3a9fa990ac40ffcd570a767ba7c435a1b141"}
Apr 21 04:45:18.223298 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.223280 2579 scope.go:117] "RemoveContainer" containerID="587d257d52fa783a3b339ade7c8e3a9fa990ac40ffcd570a767ba7c435a1b141"
Apr 21 04:45:18.271684 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.271653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.271860 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.271735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.271860 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.271767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlggp\" (UniqueName: \"kubernetes.io/projected/ed22f68e-6282-4452-b162-14446ff2929b-kube-api-access-qlggp\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.272028 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.272008 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.272118 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.272102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.296715 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.296645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlggp\" (UniqueName: \"kubernetes.io/projected/ed22f68e-6282-4452-b162-14446ff2929b-kube-api-access-qlggp\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.353592 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.353555 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:18.527885 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.527855 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"]
Apr 21 04:45:18.529281 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:45:18.529247 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded22f68e_6282_4452_b162_14446ff2929b.slice/crio-04359ace698fde58930491930ba5186a3e1ed15d3c6f751989696070055ebe95 WatchSource:0}: Error finding container 04359ace698fde58930491930ba5186a3e1ed15d3c6f751989696070055ebe95: Status 404 returned error can't find the container with id 04359ace698fde58930491930ba5186a3e1ed15d3c6f751989696070055ebe95
Apr 21 04:45:18.938665 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.938578 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-9fwhf"]
Apr 21 04:45:18.941786 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.941769 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:18.944835 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.944808 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-mkqnx\""
Apr 21 04:45:18.944942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.944822 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 21 04:45:18.956295 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:18.956274 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-9fwhf"]
Apr 21 04:45:19.077273 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.077237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrvj\" (UniqueName: \"kubernetes.io/projected/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-kube-api-access-2mrvj\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:19.077640 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.077333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:19.178587 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.178545 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:19.178757 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.178624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrvj\" (UniqueName: \"kubernetes.io/projected/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-kube-api-access-2mrvj\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:19.178757 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:19.178687 2579 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 21 04:45:19.178757 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:19.178755 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert podName:9fb5f7b9-9184-474f-a0cf-7bde29f6547f nodeName:}" failed. No retries permitted until 2026-04-21 04:45:19.678736756 +0000 UTC m=+382.305413412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert") pod "kserve-controller-manager-856948b99f-9fwhf" (UID: "9fb5f7b9-9184-474f-a0cf-7bde29f6547f") : secret "kserve-webhook-server-cert" not found
Apr 21 04:45:19.201895 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.201820 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrvj\" (UniqueName: \"kubernetes.io/projected/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-kube-api-access-2mrvj\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:19.234099 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.234061 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad901c4d-2781-4a4c-b833-73dd358da08d" containerID="55ee208b148cb596727d32e59faf72c064d55c64b12c1084795becfeab3494f6" exitCode=1
Apr 21 04:45:19.234295 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.234142 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" event={"ID":"ad901c4d-2781-4a4c-b833-73dd358da08d","Type":"ContainerDied","Data":"55ee208b148cb596727d32e59faf72c064d55c64b12c1084795becfeab3494f6"}
Apr 21 04:45:19.234295 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.234182 2579 scope.go:117] "RemoveContainer" containerID="587d257d52fa783a3b339ade7c8e3a9fa990ac40ffcd570a767ba7c435a1b141"
Apr 21 04:45:19.234494 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.234472 2579 scope.go:117] "RemoveContainer" containerID="55ee208b148cb596727d32e59faf72c064d55c64b12c1084795becfeab3494f6"
Apr 21 04:45:19.234752 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:19.234733 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-hgptw_opendatahub(ad901c4d-2781-4a4c-b833-73dd358da08d)\"" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" podUID="ad901c4d-2781-4a4c-b833-73dd358da08d"
Apr 21 04:45:19.235727 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.235677 2579 generic.go:358] "Generic (PLEG): container finished" podID="ed22f68e-6282-4452-b162-14446ff2929b" containerID="a65e9cb75b9fd514fbcd4afb4432d1363870e2ddaef9f6f435828bd64fdca24c" exitCode=0
Apr 21 04:45:19.235872 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.235783 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b" event={"ID":"ed22f68e-6282-4452-b162-14446ff2929b","Type":"ContainerDied","Data":"a65e9cb75b9fd514fbcd4afb4432d1363870e2ddaef9f6f435828bd64fdca24c"}
Apr 21 04:45:19.235872 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.235808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b" event={"ID":"ed22f68e-6282-4452-b162-14446ff2929b","Type":"ContainerStarted","Data":"04359ace698fde58930491930ba5186a3e1ed15d3c6f751989696070055ebe95"}
Apr 21 04:45:19.683287 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:19.683255 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:19.683449 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:19.683419 2579 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 21 04:45:19.683489 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:19.683482 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert podName:9fb5f7b9-9184-474f-a0cf-7bde29f6547f nodeName:}" failed. No retries permitted until 2026-04-21 04:45:20.683465717 +0000 UTC m=+383.310142383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert") pod "kserve-controller-manager-856948b99f-9fwhf" (UID: "9fb5f7b9-9184-474f-a0cf-7bde29f6547f") : secret "kserve-webhook-server-cert" not found
Apr 21 04:45:20.241831 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.241797 2579 scope.go:117] "RemoveContainer" containerID="55ee208b148cb596727d32e59faf72c064d55c64b12c1084795becfeab3494f6"
Apr 21 04:45:20.242291 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:20.242070 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-hgptw_opendatahub(ad901c4d-2781-4a4c-b833-73dd358da08d)\"" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" podUID="ad901c4d-2781-4a4c-b833-73dd358da08d"
Apr 21 04:45:20.243448 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.243418 2579 generic.go:358] "Generic (PLEG): container finished" podID="ed22f68e-6282-4452-b162-14446ff2929b" containerID="847f0aeee278ade0125fe31d00e3acbcd0c382f0b2cc9777e961fcc3cac9d2ac" exitCode=0
Apr 21 04:45:20.243617 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.243503 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b" event={"ID":"ed22f68e-6282-4452-b162-14446ff2929b","Type":"ContainerDied","Data":"847f0aeee278ade0125fe31d00e3acbcd0c382f0b2cc9777e961fcc3cac9d2ac"}
Apr 21 04:45:20.558113 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.558081 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"]
Apr 21 04:45:20.561455 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.561432 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:20.564388 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.564337 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-67lnw\""
Apr 21 04:45:20.564588 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.564571 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 21 04:45:20.564708 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.564687 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 21 04:45:20.583704 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.583675 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"]
Apr 21 04:45:20.691402 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.691341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:20.691571 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.691423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d1c165cd-bdd1-4f68-9761-44fe2f680be5-operator-config\") pod \"servicemesh-operator3-55f49c5f94-dpk88\" (UID: \"d1c165cd-bdd1-4f68-9761-44fe2f680be5\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:20.691617 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.691583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8k4f\" (UniqueName: \"kubernetes.io/projected/d1c165cd-bdd1-4f68-9761-44fe2f680be5-kube-api-access-g8k4f\") pod \"servicemesh-operator3-55f49c5f94-dpk88\" (UID: \"d1c165cd-bdd1-4f68-9761-44fe2f680be5\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:20.693775 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.693744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fb5f7b9-9184-474f-a0cf-7bde29f6547f-cert\") pod \"kserve-controller-manager-856948b99f-9fwhf\" (UID: \"9fb5f7b9-9184-474f-a0cf-7bde29f6547f\") " pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:20.751840 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.751804 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf"
Apr 21 04:45:20.792872 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.792831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d1c165cd-bdd1-4f68-9761-44fe2f680be5-operator-config\") pod \"servicemesh-operator3-55f49c5f94-dpk88\" (UID: \"d1c165cd-bdd1-4f68-9761-44fe2f680be5\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:20.793075 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.792934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8k4f\" (UniqueName: \"kubernetes.io/projected/d1c165cd-bdd1-4f68-9761-44fe2f680be5-kube-api-access-g8k4f\") pod \"servicemesh-operator3-55f49c5f94-dpk88\" (UID: \"d1c165cd-bdd1-4f68-9761-44fe2f680be5\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:20.796256 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.796210 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d1c165cd-bdd1-4f68-9761-44fe2f680be5-operator-config\") pod \"servicemesh-operator3-55f49c5f94-dpk88\" (UID: \"d1c165cd-bdd1-4f68-9761-44fe2f680be5\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:20.802399 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.802353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8k4f\" (UniqueName: \"kubernetes.io/projected/d1c165cd-bdd1-4f68-9761-44fe2f680be5-kube-api-access-g8k4f\") pod \"servicemesh-operator3-55f49c5f94-dpk88\" (UID: \"d1c165cd-bdd1-4f68-9761-44fe2f680be5\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:20.877038 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.877003 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-9fwhf"]
Apr 21 04:45:20.880269 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:45:20.880235 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb5f7b9_9184_474f_a0cf_7bde29f6547f.slice/crio-1d082c2e23e951915e722f4a9f16e8f4bf5e39aabeb39b9921d0112166203c4f WatchSource:0}: Error finding container 1d082c2e23e951915e722f4a9f16e8f4bf5e39aabeb39b9921d0112166203c4f: Status 404 returned error can't find the container with id 1d082c2e23e951915e722f4a9f16e8f4bf5e39aabeb39b9921d0112166203c4f
Apr 21 04:45:20.881159 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:20.881140 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"
Apr 21 04:45:21.039187 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:21.039161 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-dpk88"]
Apr 21 04:45:21.040958 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:45:21.040925 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1c165cd_bdd1_4f68_9761_44fe2f680be5.slice/crio-0e758921dd2b0d8ec68260f9897b07dcdb2f2836338d7ada3f3bd3b15a537f30 WatchSource:0}: Error finding container 0e758921dd2b0d8ec68260f9897b07dcdb2f2836338d7ada3f3bd3b15a537f30: Status 404 returned error can't find the container with id 0e758921dd2b0d8ec68260f9897b07dcdb2f2836338d7ada3f3bd3b15a537f30
Apr 21 04:45:21.249598 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:21.249502 2579 generic.go:358] "Generic (PLEG): container finished" podID="ed22f68e-6282-4452-b162-14446ff2929b" containerID="fdfda7d513b384068bc84ed103d726156be692c6bdb256cd464a8296df875bd6" exitCode=0
Apr 21 04:45:21.250004 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:21.249589 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b" event={"ID":"ed22f68e-6282-4452-b162-14446ff2929b","Type":"ContainerDied","Data":"fdfda7d513b384068bc84ed103d726156be692c6bdb256cd464a8296df875bd6"}
Apr 21 04:45:21.250768 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:21.250747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88" event={"ID":"d1c165cd-bdd1-4f68-9761-44fe2f680be5","Type":"ContainerStarted","Data":"0e758921dd2b0d8ec68260f9897b07dcdb2f2836338d7ada3f3bd3b15a537f30"}
Apr 21 04:45:21.251908 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:21.251886 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf" event={"ID":"9fb5f7b9-9184-474f-a0cf-7bde29f6547f","Type":"ContainerStarted","Data":"1d082c2e23e951915e722f4a9f16e8f4bf5e39aabeb39b9921d0112166203c4f"}
Apr 21 04:45:22.406953 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.406921 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b"
Apr 21 04:45:22.506100 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.506066 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlggp\" (UniqueName: \"kubernetes.io/projected/ed22f68e-6282-4452-b162-14446ff2929b-kube-api-access-qlggp\") pod \"ed22f68e-6282-4452-b162-14446ff2929b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") "
Apr 21 04:45:22.506278 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.506147 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-util\") pod \"ed22f68e-6282-4452-b162-14446ff2929b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") "
Apr 21 04:45:22.506278 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.506201 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-bundle\") pod \"ed22f68e-6282-4452-b162-14446ff2929b\" (UID: \"ed22f68e-6282-4452-b162-14446ff2929b\") "
Apr 21 04:45:22.507464 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.507350 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-bundle" (OuterVolumeSpecName: "bundle") pod "ed22f68e-6282-4452-b162-14446ff2929b" (UID: "ed22f68e-6282-4452-b162-14446ff2929b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:45:22.508908 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.508827 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed22f68e-6282-4452-b162-14446ff2929b-kube-api-access-qlggp" (OuterVolumeSpecName: "kube-api-access-qlggp") pod "ed22f68e-6282-4452-b162-14446ff2929b" (UID: "ed22f68e-6282-4452-b162-14446ff2929b"). InnerVolumeSpecName "kube-api-access-qlggp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:45:22.514678 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.514650 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-util" (OuterVolumeSpecName: "util") pod "ed22f68e-6282-4452-b162-14446ff2929b" (UID: "ed22f68e-6282-4452-b162-14446ff2929b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:45:22.607535 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.607495 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qlggp\" (UniqueName: \"kubernetes.io/projected/ed22f68e-6282-4452-b162-14446ff2929b-kube-api-access-qlggp\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:45:22.607535 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.607530 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:45:22.607535 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:22.607540 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed22f68e-6282-4452-b162-14446ff2929b-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:45:23.261153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:23.261109 2579 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b" event={"ID":"ed22f68e-6282-4452-b162-14446ff2929b","Type":"ContainerDied","Data":"04359ace698fde58930491930ba5186a3e1ed15d3c6f751989696070055ebe95"} Apr 21 04:45:23.261153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:23.261145 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24q48b" Apr 21 04:45:23.261376 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:23.261148 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04359ace698fde58930491930ba5186a3e1ed15d3c6f751989696070055ebe95" Apr 21 04:45:24.231394 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.231276 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" Apr 21 04:45:24.231849 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.231755 2579 scope.go:117] "RemoveContainer" containerID="55ee208b148cb596727d32e59faf72c064d55c64b12c1084795becfeab3494f6" Apr 21 04:45:24.232051 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:45:24.232022 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-hgptw_opendatahub(ad901c4d-2781-4a4c-b833-73dd358da08d)\"" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" podUID="ad901c4d-2781-4a4c-b833-73dd358da08d" Apr 21 04:45:24.268128 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.268076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88" event={"ID":"d1c165cd-bdd1-4f68-9761-44fe2f680be5","Type":"ContainerStarted","Data":"6290867e7e256ae7c3509c8c32bd8633c544d4a7e2c8f1715de0fbd3bf0fcfdc"} Apr 21 
04:45:24.268325 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.268177 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88" Apr 21 04:45:24.270200 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.270168 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf" event={"ID":"9fb5f7b9-9184-474f-a0cf-7bde29f6547f","Type":"ContainerStarted","Data":"a782cb2e11bd0ccf7ef68c67988c5222145429648305ce365e3d27acc7c99c66"} Apr 21 04:45:24.270455 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.270311 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf" Apr 21 04:45:24.294576 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.294511 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88" podStartSLOduration=1.363295801 podStartE2EDuration="4.294491589s" podCreationTimestamp="2026-04-21 04:45:20 +0000 UTC" firstStartedPulling="2026-04-21 04:45:21.043397923 +0000 UTC m=+383.670074580" lastFinishedPulling="2026-04-21 04:45:23.974593696 +0000 UTC m=+386.601270368" observedRunningTime="2026-04-21 04:45:24.289916846 +0000 UTC m=+386.916593536" watchObservedRunningTime="2026-04-21 04:45:24.294491589 +0000 UTC m=+386.921168268" Apr 21 04:45:24.311797 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.311753 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf" podStartSLOduration=3.216385509 podStartE2EDuration="6.311739018s" podCreationTimestamp="2026-04-21 04:45:18 +0000 UTC" firstStartedPulling="2026-04-21 04:45:20.881721404 +0000 UTC m=+383.508398071" lastFinishedPulling="2026-04-21 04:45:23.977074924 +0000 UTC m=+386.603751580" observedRunningTime="2026-04-21 04:45:24.309880409 
+0000 UTC m=+386.936557089" watchObservedRunningTime="2026-04-21 04:45:24.311739018 +0000 UTC m=+386.938415694" Apr 21 04:45:24.968110 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968073 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn"] Apr 21 04:45:24.968612 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968591 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed22f68e-6282-4452-b162-14446ff2929b" containerName="extract" Apr 21 04:45:24.968734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968613 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed22f68e-6282-4452-b162-14446ff2929b" containerName="extract" Apr 21 04:45:24.968734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968669 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed22f68e-6282-4452-b162-14446ff2929b" containerName="pull" Apr 21 04:45:24.968734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968677 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed22f68e-6282-4452-b162-14446ff2929b" containerName="pull" Apr 21 04:45:24.968734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968712 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed22f68e-6282-4452-b162-14446ff2929b" containerName="util" Apr 21 04:45:24.968734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968720 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed22f68e-6282-4452-b162-14446ff2929b" containerName="util" Apr 21 04:45:24.968979 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.968808 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed22f68e-6282-4452-b162-14446ff2929b" containerName="extract" Apr 21 04:45:24.977840 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.977819 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:24.981201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.981163 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 21 04:45:24.981342 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.981320 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 21 04:45:24.981432 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.981417 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 21 04:45:24.981668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.981650 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-t8fx5\"" Apr 21 04:45:24.982118 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.982090 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 04:45:24.984353 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:24.984330 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn"] Apr 21 04:45:25.130755 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.130722 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whzgz\" (UniqueName: \"kubernetes.io/projected/fd17ffd0-3d66-4970-b3f0-43338283c480-kube-api-access-whzgz\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.130905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.130771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istio-token\" (UniqueName: \"kubernetes.io/projected/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.130905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.130837 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.130905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.130862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.131022 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.130940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/fd17ffd0-3d66-4970-b3f0-43338283c480-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.131022 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.130984 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: 
\"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.131082 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.131022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.232321 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.232225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.232321 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.232267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.232321 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.232305 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whzgz\" (UniqueName: \"kubernetes.io/projected/fd17ffd0-3d66-4970-b3f0-43338283c480-kube-api-access-whzgz\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.232905 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:45:25.232331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.232905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.232386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.232905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.232422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.232905 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.232639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/fd17ffd0-3d66-4970-b3f0-43338283c480-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.233073 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.232971 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.234944 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.234914 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.235091 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.235072 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/fd17ffd0-3d66-4970-b3f0-43338283c480-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.235169 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.235144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.235512 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.235489 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/fd17ffd0-3d66-4970-b3f0-43338283c480-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.240737 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.240712 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fd17ffd0-3d66-4970-b3f0-43338283c480-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.240959 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.240939 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whzgz\" (UniqueName: \"kubernetes.io/projected/fd17ffd0-3d66-4970-b3f0-43338283c480-kube-api-access-whzgz\") pod \"istiod-openshift-gateway-55ff986f96-wskqn\" (UID: \"fd17ffd0-3d66-4970-b3f0-43338283c480\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.290014 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.289976 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:25.437994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:25.437970 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn"] Apr 21 04:45:25.439896 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:45:25.439870 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd17ffd0_3d66_4970_b3f0_43338283c480.slice/crio-e470913d807218cd16cc3767c76a75e03c8eb55b8f906962d0de19ba529900c2 WatchSource:0}: Error finding container e470913d807218cd16cc3767c76a75e03c8eb55b8f906962d0de19ba529900c2: Status 404 returned error can't find the container with id e470913d807218cd16cc3767c76a75e03c8eb55b8f906962d0de19ba529900c2 Apr 21 04:45:26.279958 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:26.279913 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" 
event={"ID":"fd17ffd0-3d66-4970-b3f0-43338283c480","Type":"ContainerStarted","Data":"e470913d807218cd16cc3767c76a75e03c8eb55b8f906962d0de19ba529900c2"} Apr 21 04:45:28.071331 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:28.071291 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 04:45:28.071642 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:28.071359 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 04:45:28.290282 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:28.290244 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" event={"ID":"fd17ffd0-3d66-4970-b3f0-43338283c480","Type":"ContainerStarted","Data":"90aec50b0f7a2176b3a0daa90627a85edac03a8a058c2892478a943f6c906745"} Apr 21 04:45:28.290464 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:28.290340 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:28.312994 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:28.312945 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" podStartSLOduration=1.683987192 podStartE2EDuration="4.312930935s" podCreationTimestamp="2026-04-21 04:45:24 +0000 UTC" firstStartedPulling="2026-04-21 04:45:25.442115297 +0000 UTC m=+388.068791957" lastFinishedPulling="2026-04-21 04:45:28.071059031 +0000 UTC m=+390.697735700" observedRunningTime="2026-04-21 04:45:28.309467816 +0000 UTC m=+390.936144495" watchObservedRunningTime="2026-04-21 04:45:28.312930935 +0000 UTC m=+390.939607613" Apr 21 04:45:29.297022 ip-10-0-135-122 kubenswrapper[2579]: I0421 
04:45:29.296992 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wskqn" Apr 21 04:45:34.231418 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:34.231331 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" Apr 21 04:45:34.231913 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:34.231871 2579 scope.go:117] "RemoveContainer" containerID="55ee208b148cb596727d32e59faf72c064d55c64b12c1084795becfeab3494f6" Apr 21 04:45:35.276976 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:35.276944 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-dpk88" Apr 21 04:45:35.324609 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:35.324572 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" event={"ID":"ad901c4d-2781-4a4c-b833-73dd358da08d","Type":"ContainerStarted","Data":"0a67f5a1a3e2694d623328ba6a74480aa0309ac5697429171f4e0c19cf861875"} Apr 21 04:45:35.324815 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:35.324800 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" Apr 21 04:45:35.344666 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:35.344616 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" podStartSLOduration=3.371670998 podStartE2EDuration="23.34459989s" podCreationTimestamp="2026-04-21 04:45:12 +0000 UTC" firstStartedPulling="2026-04-21 04:45:14.576071827 +0000 UTC m=+377.202748488" lastFinishedPulling="2026-04-21 04:45:34.549000704 +0000 UTC m=+397.175677380" observedRunningTime="2026-04-21 04:45:35.342955206 +0000 UTC m=+397.969631885" watchObservedRunningTime="2026-04-21 04:45:35.34459989 +0000 UTC 
m=+397.971276567" Apr 21 04:45:46.332190 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:46.332155 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-hgptw" Apr 21 04:45:55.279668 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:45:55.279635 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-9fwhf" Apr 21 04:46:25.222128 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.222046 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6"] Apr 21 04:46:25.225677 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.225658 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.228334 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.228316 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 04:46:25.229517 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.229418 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pgtpf\"" Apr 21 04:46:25.229517 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.229471 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 04:46:25.233070 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.233050 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6"] Apr 21 04:46:25.342346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.342301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.342546 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.342446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.342546 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.342490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv27j\" (UniqueName: \"kubernetes.io/projected/48c195b5-8367-4ab4-bc83-68ff346679da-kube-api-access-jv27j\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.443088 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.443050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.443267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.443123 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.443267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.443146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv27j\" (UniqueName: \"kubernetes.io/projected/48c195b5-8367-4ab4-bc83-68ff346679da-kube-api-access-jv27j\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.443508 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.443487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.443553 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.443522 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.451519 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.451492 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv27j\" (UniqueName: \"kubernetes.io/projected/48c195b5-8367-4ab4-bc83-68ff346679da-kube-api-access-jv27j\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.536470 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.536353 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:25.627680 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.627646 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h"] Apr 21 04:46:25.632433 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.632410 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.641942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.641715 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h"] Apr 21 04:46:25.666403 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.666358 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6"] Apr 21 04:46:25.668675 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:46:25.668649 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48c195b5_8367_4ab4_bc83_68ff346679da.slice/crio-61e7a2d4ced29690eb435e10049ef4a309fb71550c913b6239973dc28a44ebd8 WatchSource:0}: Error finding container 61e7a2d4ced29690eb435e10049ef4a309fb71550c913b6239973dc28a44ebd8: Status 404 returned error can't find the container with id 61e7a2d4ced29690eb435e10049ef4a309fb71550c913b6239973dc28a44ebd8 Apr 21 04:46:25.746075 ip-10-0-135-122 kubenswrapper[2579]: I0421 
04:46:25.746045 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.746195 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.746090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4cd4\" (UniqueName: \"kubernetes.io/projected/89c8de52-d4f7-41e0-9564-c4998283fee0-kube-api-access-h4cd4\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.746247 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.746191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.847091 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.847048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.847247 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.847114 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.847247 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.847140 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4cd4\" (UniqueName: \"kubernetes.io/projected/89c8de52-d4f7-41e0-9564-c4998283fee0-kube-api-access-h4cd4\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.847451 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.847429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.847506 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.847472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.855776 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.855755 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4cd4\" (UniqueName: 
\"kubernetes.io/projected/89c8de52-d4f7-41e0-9564-c4998283fee0-kube-api-access-h4cd4\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:25.945676 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:25.945637 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:26.068603 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.068576 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h"] Apr 21 04:46:26.070283 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:46:26.070255 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c8de52_d4f7_41e0_9564_c4998283fee0.slice/crio-76a64671531726bf89fe81539af7dbaadac7c07856f31a6fa9a17b2b8b3a9f83 WatchSource:0}: Error finding container 76a64671531726bf89fe81539af7dbaadac7c07856f31a6fa9a17b2b8b3a9f83: Status 404 returned error can't find the container with id 76a64671531726bf89fe81539af7dbaadac7c07856f31a6fa9a17b2b8b3a9f83 Apr 21 04:46:26.227868 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.227827 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg"] Apr 21 04:46:26.231856 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.231830 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.237752 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.237724 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg"] Apr 21 04:46:26.352045 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.352006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.352220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.352069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.352220 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.352106 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnqf6\" (UniqueName: \"kubernetes.io/projected/c02fdf1f-f348-49e7-b758-6a6122200fb8-kube-api-access-jnqf6\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.453410 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.453301 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.453410 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.453358 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.453410 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.453400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnqf6\" (UniqueName: \"kubernetes.io/projected/c02fdf1f-f348-49e7-b758-6a6122200fb8-kube-api-access-jnqf6\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.453746 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.453719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.453810 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.453754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.462459 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.462422 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnqf6\" (UniqueName: \"kubernetes.io/projected/c02fdf1f-f348-49e7-b758-6a6122200fb8-kube-api-access-jnqf6\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.534281 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.534242 2579 generic.go:358] "Generic (PLEG): container finished" podID="48c195b5-8367-4ab4-bc83-68ff346679da" containerID="430b98a8b034b6b0baaa49c18bd79ef5e543db2451fb4b511cdfe7ac700f9085" exitCode=0 Apr 21 04:46:26.534484 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.534334 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" event={"ID":"48c195b5-8367-4ab4-bc83-68ff346679da","Type":"ContainerDied","Data":"430b98a8b034b6b0baaa49c18bd79ef5e543db2451fb4b511cdfe7ac700f9085"} Apr 21 04:46:26.534484 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.534390 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" event={"ID":"48c195b5-8367-4ab4-bc83-68ff346679da","Type":"ContainerStarted","Data":"61e7a2d4ced29690eb435e10049ef4a309fb71550c913b6239973dc28a44ebd8"} Apr 21 04:46:26.535749 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.535728 2579 generic.go:358] "Generic (PLEG): container finished" podID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerID="fb04ae152209e45675dd614c01317d0d8aafcb8c74060c183285a431248d14a6" exitCode=0 Apr 21 
04:46:26.535859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.535814 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" event={"ID":"89c8de52-d4f7-41e0-9564-c4998283fee0","Type":"ContainerDied","Data":"fb04ae152209e45675dd614c01317d0d8aafcb8c74060c183285a431248d14a6"} Apr 21 04:46:26.535859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.535848 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" event={"ID":"89c8de52-d4f7-41e0-9564-c4998283fee0","Type":"ContainerStarted","Data":"76a64671531726bf89fe81539af7dbaadac7c07856f31a6fa9a17b2b8b3a9f83"} Apr 21 04:46:26.543638 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.543617 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:26.676062 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.676037 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg"] Apr 21 04:46:26.678065 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:46:26.678032 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc02fdf1f_f348_49e7_b758_6a6122200fb8.slice/crio-3e50680a10732a5fe6ef62ae09b4b59ff2f857f4f41e8b5ef211f3655ee32b5c WatchSource:0}: Error finding container 3e50680a10732a5fe6ef62ae09b4b59ff2f857f4f41e8b5ef211f3655ee32b5c: Status 404 returned error can't find the container with id 3e50680a10732a5fe6ef62ae09b4b59ff2f857f4f41e8b5ef211f3655ee32b5c Apr 21 04:46:26.823968 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.823933 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj"] Apr 21 
04:46:26.827767 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.827744 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:26.835672 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.835642 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj"] Apr 21 04:46:26.962720 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.962625 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:26.962720 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.962708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:26.962914 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:26.962740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvd44\" (UniqueName: \"kubernetes.io/projected/687ba120-5872-4868-8d82-71688d3868b5-kube-api-access-wvd44\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.064243 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:46:27.064204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.064443 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.064269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.064443 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.064292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvd44\" (UniqueName: \"kubernetes.io/projected/687ba120-5872-4868-8d82-71688d3868b5-kube-api-access-wvd44\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.064677 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.064654 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.064744 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.064672 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.073993 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.073969 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvd44\" (UniqueName: \"kubernetes.io/projected/687ba120-5872-4868-8d82-71688d3868b5-kube-api-access-wvd44\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.154434 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.154402 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:27.494613 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.494579 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj"] Apr 21 04:46:27.532499 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:46:27.532456 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687ba120_5872_4868_8d82_71688d3868b5.slice/crio-d13d80817c548d10f4abebe740db57e43de3a04e9bcf4a796d63324f5d1ebe02 WatchSource:0}: Error finding container d13d80817c548d10f4abebe740db57e43de3a04e9bcf4a796d63324f5d1ebe02: Status 404 returned error can't find the container with id d13d80817c548d10f4abebe740db57e43de3a04e9bcf4a796d63324f5d1ebe02 Apr 21 04:46:27.541261 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.541231 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerID="150f6bb40874c691ae5054fe1fe6b31d3c2f8ade5ea0ba9131c6ec3b34942124" exitCode=0 Apr 21 04:46:27.541387 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.541308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" event={"ID":"89c8de52-d4f7-41e0-9564-c4998283fee0","Type":"ContainerDied","Data":"150f6bb40874c691ae5054fe1fe6b31d3c2f8ade5ea0ba9131c6ec3b34942124"} Apr 21 04:46:27.542573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.542550 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" event={"ID":"687ba120-5872-4868-8d82-71688d3868b5","Type":"ContainerStarted","Data":"d13d80817c548d10f4abebe740db57e43de3a04e9bcf4a796d63324f5d1ebe02"} Apr 21 04:46:27.543845 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.543824 2579 generic.go:358] "Generic (PLEG): container finished" podID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerID="bb89b6b1f9684dbe59f096e660316db73479e20e9d35c6798cfe2dbc38a63548" exitCode=0 Apr 21 04:46:27.543929 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.543905 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" event={"ID":"c02fdf1f-f348-49e7-b758-6a6122200fb8","Type":"ContainerDied","Data":"bb89b6b1f9684dbe59f096e660316db73479e20e9d35c6798cfe2dbc38a63548"} Apr 21 04:46:27.543929 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:27.543922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" event={"ID":"c02fdf1f-f348-49e7-b758-6a6122200fb8","Type":"ContainerStarted","Data":"3e50680a10732a5fe6ef62ae09b4b59ff2f857f4f41e8b5ef211f3655ee32b5c"} Apr 21 04:46:28.549322 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.549286 2579 
generic.go:358] "Generic (PLEG): container finished" podID="48c195b5-8367-4ab4-bc83-68ff346679da" containerID="12fc3d04d0f9a3a822f038ddc50ee9e360e844ff599777f3ae1d767296cffc2a" exitCode=0 Apr 21 04:46:28.549826 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.549382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" event={"ID":"48c195b5-8367-4ab4-bc83-68ff346679da","Type":"ContainerDied","Data":"12fc3d04d0f9a3a822f038ddc50ee9e360e844ff599777f3ae1d767296cffc2a"} Apr 21 04:46:28.551467 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.551444 2579 generic.go:358] "Generic (PLEG): container finished" podID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerID="fdaa78da7df6923ed808fe7a4aa5709aafda93badd0856ada591d63b8dd06f7a" exitCode=0 Apr 21 04:46:28.551562 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.551507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" event={"ID":"89c8de52-d4f7-41e0-9564-c4998283fee0","Type":"ContainerDied","Data":"fdaa78da7df6923ed808fe7a4aa5709aafda93badd0856ada591d63b8dd06f7a"} Apr 21 04:46:28.552882 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.552860 2579 generic.go:358] "Generic (PLEG): container finished" podID="687ba120-5872-4868-8d82-71688d3868b5" containerID="8014838e8c6297fad5c46a4e8522c756dfc44daffb9cfe728c434359d37600fc" exitCode=0 Apr 21 04:46:28.552971 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.552941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" event={"ID":"687ba120-5872-4868-8d82-71688d3868b5","Type":"ContainerDied","Data":"8014838e8c6297fad5c46a4e8522c756dfc44daffb9cfe728c434359d37600fc"} Apr 21 04:46:28.554704 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.554685 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerID="69dd6768b9356df9b431521ce9a94451a0fa6ef53187f3eb85c9c355491d5530" exitCode=0 Apr 21 04:46:28.554787 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:28.554765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" event={"ID":"c02fdf1f-f348-49e7-b758-6a6122200fb8","Type":"ContainerDied","Data":"69dd6768b9356df9b431521ce9a94451a0fa6ef53187f3eb85c9c355491d5530"} Apr 21 04:46:29.564149 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.564114 2579 generic.go:358] "Generic (PLEG): container finished" podID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerID="11c8ad58e7e8474a51655ee9d5d89bfb5c403d344948dbd7559f5ce393e2fa48" exitCode=0 Apr 21 04:46:29.564636 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.564187 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" event={"ID":"c02fdf1f-f348-49e7-b758-6a6122200fb8","Type":"ContainerDied","Data":"11c8ad58e7e8474a51655ee9d5d89bfb5c403d344948dbd7559f5ce393e2fa48"} Apr 21 04:46:29.565982 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.565958 2579 generic.go:358] "Generic (PLEG): container finished" podID="48c195b5-8367-4ab4-bc83-68ff346679da" containerID="3a4f4b9ef2c420b9fda804c313db82758b5c65239ef1b355721ad53eecafb904" exitCode=0 Apr 21 04:46:29.566097 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.566040 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" event={"ID":"48c195b5-8367-4ab4-bc83-68ff346679da","Type":"ContainerDied","Data":"3a4f4b9ef2c420b9fda804c313db82758b5c65239ef1b355721ad53eecafb904"} Apr 21 04:46:29.567443 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.567423 2579 generic.go:358] "Generic (PLEG): container finished" podID="687ba120-5872-4868-8d82-71688d3868b5" 
containerID="5709f69ef76f7f2bafcb4bcc6c4a80a81d6be563efa22c99091264eaf1701bfe" exitCode=0 Apr 21 04:46:29.567542 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.567505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" event={"ID":"687ba120-5872-4868-8d82-71688d3868b5","Type":"ContainerDied","Data":"5709f69ef76f7f2bafcb4bcc6c4a80a81d6be563efa22c99091264eaf1701bfe"} Apr 21 04:46:29.756218 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.756186 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:29.894196 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.894098 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4cd4\" (UniqueName: \"kubernetes.io/projected/89c8de52-d4f7-41e0-9564-c4998283fee0-kube-api-access-h4cd4\") pod \"89c8de52-d4f7-41e0-9564-c4998283fee0\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " Apr 21 04:46:29.894196 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.894196 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-bundle\") pod \"89c8de52-d4f7-41e0-9564-c4998283fee0\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " Apr 21 04:46:29.894462 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.894223 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-util\") pod \"89c8de52-d4f7-41e0-9564-c4998283fee0\" (UID: \"89c8de52-d4f7-41e0-9564-c4998283fee0\") " Apr 21 04:46:29.894713 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.894676 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-bundle" (OuterVolumeSpecName: "bundle") pod "89c8de52-d4f7-41e0-9564-c4998283fee0" (UID: "89c8de52-d4f7-41e0-9564-c4998283fee0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:29.896470 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.896442 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c8de52-d4f7-41e0-9564-c4998283fee0-kube-api-access-h4cd4" (OuterVolumeSpecName: "kube-api-access-h4cd4") pod "89c8de52-d4f7-41e0-9564-c4998283fee0" (UID: "89c8de52-d4f7-41e0-9564-c4998283fee0"). InnerVolumeSpecName "kube-api-access-h4cd4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:46:29.900348 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.900321 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-util" (OuterVolumeSpecName: "util") pod "89c8de52-d4f7-41e0-9564-c4998283fee0" (UID: "89c8de52-d4f7-41e0-9564-c4998283fee0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:29.995694 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.995659 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4cd4\" (UniqueName: \"kubernetes.io/projected/89c8de52-d4f7-41e0-9564-c4998283fee0-kube-api-access-h4cd4\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:29.995694 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.995691 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:29.995694 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:29.995702 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89c8de52-d4f7-41e0-9564-c4998283fee0-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:30.573347 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.573307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" event={"ID":"89c8de52-d4f7-41e0-9564-c4998283fee0","Type":"ContainerDied","Data":"76a64671531726bf89fe81539af7dbaadac7c07856f31a6fa9a17b2b8b3a9f83"} Apr 21 04:46:30.573347 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.573339 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h" Apr 21 04:46:30.573824 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.573346 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76a64671531726bf89fe81539af7dbaadac7c07856f31a6fa9a17b2b8b3a9f83" Apr 21 04:46:30.575177 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.575153 2579 generic.go:358] "Generic (PLEG): container finished" podID="687ba120-5872-4868-8d82-71688d3868b5" containerID="339b398d2f6e6208f273a5e42ca1266d8b6c736c9abde8dba93ff8e621c33a7e" exitCode=0 Apr 21 04:46:30.575310 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.575241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" event={"ID":"687ba120-5872-4868-8d82-71688d3868b5","Type":"ContainerDied","Data":"339b398d2f6e6208f273a5e42ca1266d8b6c736c9abde8dba93ff8e621c33a7e"} Apr 21 04:46:30.743243 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.743223 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:30.746544 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.746526 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:30.907016 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.906983 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnqf6\" (UniqueName: \"kubernetes.io/projected/c02fdf1f-f348-49e7-b758-6a6122200fb8-kube-api-access-jnqf6\") pod \"c02fdf1f-f348-49e7-b758-6a6122200fb8\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " Apr 21 04:46:30.907203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.907089 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-util\") pod \"c02fdf1f-f348-49e7-b758-6a6122200fb8\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " Apr 21 04:46:30.907203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.907113 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-util\") pod \"48c195b5-8367-4ab4-bc83-68ff346679da\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " Apr 21 04:46:30.907203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.907138 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-bundle\") pod \"c02fdf1f-f348-49e7-b758-6a6122200fb8\" (UID: \"c02fdf1f-f348-49e7-b758-6a6122200fb8\") " Apr 21 04:46:30.907203 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.907160 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv27j\" (UniqueName: \"kubernetes.io/projected/48c195b5-8367-4ab4-bc83-68ff346679da-kube-api-access-jv27j\") pod \"48c195b5-8367-4ab4-bc83-68ff346679da\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " Apr 21 04:46:30.907203 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:46:30.907185 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-bundle\") pod \"48c195b5-8367-4ab4-bc83-68ff346679da\" (UID: \"48c195b5-8367-4ab4-bc83-68ff346679da\") " Apr 21 04:46:30.907925 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.907897 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-bundle" (OuterVolumeSpecName: "bundle") pod "48c195b5-8367-4ab4-bc83-68ff346679da" (UID: "48c195b5-8367-4ab4-bc83-68ff346679da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:30.908031 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.907915 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-bundle" (OuterVolumeSpecName: "bundle") pod "c02fdf1f-f348-49e7-b758-6a6122200fb8" (UID: "c02fdf1f-f348-49e7-b758-6a6122200fb8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:30.909917 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.909893 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c02fdf1f-f348-49e7-b758-6a6122200fb8-kube-api-access-jnqf6" (OuterVolumeSpecName: "kube-api-access-jnqf6") pod "c02fdf1f-f348-49e7-b758-6a6122200fb8" (UID: "c02fdf1f-f348-49e7-b758-6a6122200fb8"). InnerVolumeSpecName "kube-api-access-jnqf6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:46:30.910010 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.909925 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c195b5-8367-4ab4-bc83-68ff346679da-kube-api-access-jv27j" (OuterVolumeSpecName: "kube-api-access-jv27j") pod "48c195b5-8367-4ab4-bc83-68ff346679da" (UID: "48c195b5-8367-4ab4-bc83-68ff346679da"). InnerVolumeSpecName "kube-api-access-jv27j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:46:30.912517 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.912488 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-util" (OuterVolumeSpecName: "util") pod "48c195b5-8367-4ab4-bc83-68ff346679da" (UID: "48c195b5-8367-4ab4-bc83-68ff346679da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:30.912735 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:30.912714 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-util" (OuterVolumeSpecName: "util") pod "c02fdf1f-f348-49e7-b758-6a6122200fb8" (UID: "c02fdf1f-f348-49e7-b758-6a6122200fb8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:31.007899 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.007862 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jnqf6\" (UniqueName: \"kubernetes.io/projected/c02fdf1f-f348-49e7-b758-6a6122200fb8-kube-api-access-jnqf6\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.007899 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.007890 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.007899 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.007900 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.007899 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.007908 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c02fdf1f-f348-49e7-b758-6a6122200fb8-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.008153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.007917 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jv27j\" (UniqueName: \"kubernetes.io/projected/48c195b5-8367-4ab4-bc83-68ff346679da-kube-api-access-jv27j\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.008153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.007926 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c195b5-8367-4ab4-bc83-68ff346679da-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.581210 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.581183 2579 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" Apr 21 04:46:31.581210 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.581190 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg" event={"ID":"c02fdf1f-f348-49e7-b758-6a6122200fb8","Type":"ContainerDied","Data":"3e50680a10732a5fe6ef62ae09b4b59ff2f857f4f41e8b5ef211f3655ee32b5c"} Apr 21 04:46:31.581717 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.581219 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e50680a10732a5fe6ef62ae09b4b59ff2f857f4f41e8b5ef211f3655ee32b5c" Apr 21 04:46:31.583040 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.583011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" event={"ID":"48c195b5-8367-4ab4-bc83-68ff346679da","Type":"ContainerDied","Data":"61e7a2d4ced29690eb435e10049ef4a309fb71550c913b6239973dc28a44ebd8"} Apr 21 04:46:31.583158 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.583045 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6" Apr 21 04:46:31.583158 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.583046 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61e7a2d4ced29690eb435e10049ef4a309fb71550c913b6239973dc28a44ebd8" Apr 21 04:46:31.707457 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.707434 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:31.815198 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.815157 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-bundle\") pod \"687ba120-5872-4868-8d82-71688d3868b5\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " Apr 21 04:46:31.815198 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.815202 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvd44\" (UniqueName: \"kubernetes.io/projected/687ba120-5872-4868-8d82-71688d3868b5-kube-api-access-wvd44\") pod \"687ba120-5872-4868-8d82-71688d3868b5\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " Apr 21 04:46:31.815481 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.815229 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-util\") pod \"687ba120-5872-4868-8d82-71688d3868b5\" (UID: \"687ba120-5872-4868-8d82-71688d3868b5\") " Apr 21 04:46:31.815709 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.815674 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-bundle" (OuterVolumeSpecName: "bundle") pod "687ba120-5872-4868-8d82-71688d3868b5" (UID: "687ba120-5872-4868-8d82-71688d3868b5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:31.817406 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.817379 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687ba120-5872-4868-8d82-71688d3868b5-kube-api-access-wvd44" (OuterVolumeSpecName: "kube-api-access-wvd44") pod "687ba120-5872-4868-8d82-71688d3868b5" (UID: "687ba120-5872-4868-8d82-71688d3868b5"). InnerVolumeSpecName "kube-api-access-wvd44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:46:31.820429 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.820408 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-util" (OuterVolumeSpecName: "util") pod "687ba120-5872-4868-8d82-71688d3868b5" (UID: "687ba120-5872-4868-8d82-71688d3868b5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:46:31.916201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.916173 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.916201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.916200 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvd44\" (UniqueName: \"kubernetes.io/projected/687ba120-5872-4868-8d82-71688d3868b5-kube-api-access-wvd44\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:31.916385 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:31.916219 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/687ba120-5872-4868-8d82-71688d3868b5-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:46:32.588666 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:32.588629 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" event={"ID":"687ba120-5872-4868-8d82-71688d3868b5","Type":"ContainerDied","Data":"d13d80817c548d10f4abebe740db57e43de3a04e9bcf4a796d63324f5d1ebe02"} Apr 21 04:46:32.588666 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:32.588667 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13d80817c548d10f4abebe740db57e43de3a04e9bcf4a796d63324f5d1ebe02" Apr 21 04:46:32.589055 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:32.588707 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj" Apr 21 04:46:35.278460 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.278429 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59f6d74895-724gv"] Apr 21 04:46:35.278980 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.278961 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerName="util" Apr 21 04:46:35.279023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.278984 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerName="util" Apr 21 04:46:35.279023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.278996 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="687ba120-5872-4868-8d82-71688d3868b5" containerName="util" Apr 21 04:46:35.279023 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279006 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ba120-5872-4868-8d82-71688d3868b5" containerName="util" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279023 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48c195b5-8367-4ab4-bc83-68ff346679da" 
containerName="util" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279034 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c195b5-8367-4ab4-bc83-68ff346679da" containerName="util" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279043 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerName="extract" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279051 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerName="extract" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279067 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerName="extract" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279074 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerName="extract" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279085 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48c195b5-8367-4ab4-bc83-68ff346679da" containerName="extract" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279093 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c195b5-8367-4ab4-bc83-68ff346679da" containerName="extract" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279107 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerName="pull" Apr 21 04:46:35.279114 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279114 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerName="pull" Apr 21 04:46:35.279414 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:46:35.279128 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerName="util" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279136 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerName="util" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279148 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="687ba120-5872-4868-8d82-71688d3868b5" containerName="pull" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279155 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ba120-5872-4868-8d82-71688d3868b5" containerName="pull" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279163 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="687ba120-5872-4868-8d82-71688d3868b5" containerName="extract" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279172 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ba120-5872-4868-8d82-71688d3868b5" containerName="extract" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279194 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48c195b5-8367-4ab4-bc83-68ff346679da" containerName="pull" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279202 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c195b5-8367-4ab4-bc83-68ff346679da" containerName="pull" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279208 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerName="pull" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279213 2579 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerName="pull" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279287 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="687ba120-5872-4868-8d82-71688d3868b5" containerName="extract" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279297 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="48c195b5-8367-4ab4-bc83-68ff346679da" containerName="extract" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279306 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c02fdf1f-f348-49e7-b758-6a6122200fb8" containerName="extract" Apr 21 04:46:35.279414 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.279313 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="89c8de52-d4f7-41e0-9564-c4998283fee0" containerName="extract" Apr 21 04:46:35.284652 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.284631 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.292949 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.292926 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59f6d74895-724gv"] Apr 21 04:46:35.447606 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.447568 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-oauth-serving-cert\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.447766 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.447619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wnb\" (UniqueName: \"kubernetes.io/projected/fee2d97b-4b30-43a9-8e85-2333a013d782-kube-api-access-m2wnb\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.447766 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.447657 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fee2d97b-4b30-43a9-8e85-2333a013d782-console-oauth-config\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.447766 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.447680 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fee2d97b-4b30-43a9-8e85-2333a013d782-console-serving-cert\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " 
pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.447766 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.447709 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-service-ca\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.447931 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.447805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-console-config\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.447931 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.447837 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-trusted-ca-bundle\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.549156 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.549129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fee2d97b-4b30-43a9-8e85-2333a013d782-console-serving-cert\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.549267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.549171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-service-ca\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.549267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.549241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-console-config\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.549267 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.549264 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-trusted-ca-bundle\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.549409 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.549306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-oauth-serving-cert\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.549409 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.549352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wnb\" (UniqueName: \"kubernetes.io/projected/fee2d97b-4b30-43a9-8e85-2333a013d782-kube-api-access-m2wnb\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.549556 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.549525 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fee2d97b-4b30-43a9-8e85-2333a013d782-console-oauth-config\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.550087 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.550058 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-service-ca\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.550087 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.550075 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-oauth-serving-cert\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.550230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.550153 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-console-config\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.550230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.550213 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fee2d97b-4b30-43a9-8e85-2333a013d782-trusted-ca-bundle\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.552264 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.552243 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fee2d97b-4b30-43a9-8e85-2333a013d782-console-oauth-config\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.552335 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.552321 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fee2d97b-4b30-43a9-8e85-2333a013d782-console-serving-cert\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.557722 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.557698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wnb\" (UniqueName: \"kubernetes.io/projected/fee2d97b-4b30-43a9-8e85-2333a013d782-kube-api-access-m2wnb\") pod \"console-59f6d74895-724gv\" (UID: \"fee2d97b-4b30-43a9-8e85-2333a013d782\") " pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.594811 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.594771 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:35.727538 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:35.727510 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59f6d74895-724gv"] Apr 21 04:46:35.729334 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:46:35.729311 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee2d97b_4b30_43a9_8e85_2333a013d782.slice/crio-2ed486405efed084315cd4f290e6ccb9cc0effa7e0c1e004490bc617ef5eec6c WatchSource:0}: Error finding container 2ed486405efed084315cd4f290e6ccb9cc0effa7e0c1e004490bc617ef5eec6c: Status 404 returned error can't find the container with id 2ed486405efed084315cd4f290e6ccb9cc0effa7e0c1e004490bc617ef5eec6c Apr 21 04:46:36.604777 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:36.604745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f6d74895-724gv" event={"ID":"fee2d97b-4b30-43a9-8e85-2333a013d782","Type":"ContainerStarted","Data":"7836c7e7284e98968500801626506a6f70ab8864a377b2bdee960f18852783c7"} Apr 21 04:46:36.604777 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:36.604781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59f6d74895-724gv" event={"ID":"fee2d97b-4b30-43a9-8e85-2333a013d782","Type":"ContainerStarted","Data":"2ed486405efed084315cd4f290e6ccb9cc0effa7e0c1e004490bc617ef5eec6c"} Apr 21 04:46:36.624665 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:36.624616 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59f6d74895-724gv" podStartSLOduration=1.624602539 podStartE2EDuration="1.624602539s" podCreationTimestamp="2026-04-21 04:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:46:36.623222124 +0000 UTC 
m=+459.249898816" watchObservedRunningTime="2026-04-21 04:46:36.624602539 +0000 UTC m=+459.251279217" Apr 21 04:46:45.595924 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:45.595879 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:45.596321 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:45.596011 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:45.600842 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:45.600818 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:45.641416 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:45.641389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59f6d74895-724gv" Apr 21 04:46:45.704792 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:45.704760 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8678864d96-bhm72"] Apr 21 04:46:46.106286 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.106240 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4dfgn"] Apr 21 04:46:46.111594 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.111562 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4dfgn"
Apr 21 04:46:46.114773 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.114742 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 04:46:46.114930 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.114780 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5pbpw\""
Apr 21 04:46:46.114930 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.114805 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 04:46:46.128067 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.128037 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4dfgn"]
Apr 21 04:46:46.143332 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.143161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh48h\" (UniqueName: \"kubernetes.io/projected/1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13-kube-api-access-rh48h\") pod \"authorino-operator-657f44b778-4dfgn\" (UID: \"1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13\") " pod="kuadrant-system/authorino-operator-657f44b778-4dfgn"
Apr 21 04:46:46.244676 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.244633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh48h\" (UniqueName: \"kubernetes.io/projected/1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13-kube-api-access-rh48h\") pod \"authorino-operator-657f44b778-4dfgn\" (UID: \"1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13\") " pod="kuadrant-system/authorino-operator-657f44b778-4dfgn"
Apr 21 04:46:46.256834 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.256795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh48h\" (UniqueName: \"kubernetes.io/projected/1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13-kube-api-access-rh48h\") pod \"authorino-operator-657f44b778-4dfgn\" (UID: \"1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13\") " pod="kuadrant-system/authorino-operator-657f44b778-4dfgn"
Apr 21 04:46:46.424340 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.424247 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4dfgn"
Apr 21 04:46:46.564478 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.564452 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4dfgn"]
Apr 21 04:46:46.566229 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:46:46.566199 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e3f3eb9_cd95_443f_a8e6_5297ac3a8b13.slice/crio-1ef0b2e531b6f80818ab879af97d4d588ed229ec618f4d775385b18329acb00f WatchSource:0}: Error finding container 1ef0b2e531b6f80818ab879af97d4d588ed229ec618f4d775385b18329acb00f: Status 404 returned error can't find the container with id 1ef0b2e531b6f80818ab879af97d4d588ed229ec618f4d775385b18329acb00f
Apr 21 04:46:46.642729 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:46.642691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4dfgn" event={"ID":"1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13","Type":"ContainerStarted","Data":"1ef0b2e531b6f80818ab879af97d4d588ed229ec618f4d775385b18329acb00f"}
Apr 21 04:46:48.652769 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:48.652679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4dfgn" event={"ID":"1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13","Type":"ContainerStarted","Data":"0e2713cd4477a7cf7b506f5584259ae0c177476dda1f67f844d56dbe2c6c9e51"}
Apr 21 04:46:48.653124 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:48.652789 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-4dfgn"
Apr 21 04:46:48.673052 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:48.673004 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-4dfgn" podStartSLOduration=0.8542919 podStartE2EDuration="2.672989612s" podCreationTimestamp="2026-04-21 04:46:46 +0000 UTC" firstStartedPulling="2026-04-21 04:46:46.568097643 +0000 UTC m=+469.194774299" lastFinishedPulling="2026-04-21 04:46:48.386795354 +0000 UTC m=+471.013472011" observedRunningTime="2026-04-21 04:46:48.670908939 +0000 UTC m=+471.297585618" watchObservedRunningTime="2026-04-21 04:46:48.672989612 +0000 UTC m=+471.299666286"
Apr 21 04:46:59.659473 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:46:59.659438 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-4dfgn"
Apr 21 04:47:02.383561 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.383528 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"]
Apr 21 04:47:02.388597 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.388575 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.391351 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.391327 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rxf4s\""
Apr 21 04:47:02.412431 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.412393 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"]
Apr 21 04:47:02.471053 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.471010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/89077e26-719b-4f17-87a8-374f267cd097-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wf9x2\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.471216 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.471118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2kl\" (UniqueName: \"kubernetes.io/projected/89077e26-719b-4f17-87a8-374f267cd097-kube-api-access-mn2kl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wf9x2\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.572153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.572108 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2kl\" (UniqueName: \"kubernetes.io/projected/89077e26-719b-4f17-87a8-374f267cd097-kube-api-access-mn2kl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wf9x2\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.572349 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.572209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/89077e26-719b-4f17-87a8-374f267cd097-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wf9x2\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.572640 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.572617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/89077e26-719b-4f17-87a8-374f267cd097-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wf9x2\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.582357 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.582327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2kl\" (UniqueName: \"kubernetes.io/projected/89077e26-719b-4f17-87a8-374f267cd097-kube-api-access-mn2kl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wf9x2\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.701599 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.701511 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:02.832224 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:02.832186 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"]
Apr 21 04:47:02.833896 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:47:02.833869 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89077e26_719b_4f17_87a8_374f267cd097.slice/crio-0bd5545448dd9c59e1ca6f0a038007f035d8b8a0779ab9b8ea0535b312c0aad1 WatchSource:0}: Error finding container 0bd5545448dd9c59e1ca6f0a038007f035d8b8a0779ab9b8ea0535b312c0aad1: Status 404 returned error can't find the container with id 0bd5545448dd9c59e1ca6f0a038007f035d8b8a0779ab9b8ea0535b312c0aad1
Apr 21 04:47:03.712002 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:03.711962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2" event={"ID":"89077e26-719b-4f17-87a8-374f267cd097","Type":"ContainerStarted","Data":"0bd5545448dd9c59e1ca6f0a038007f035d8b8a0779ab9b8ea0535b312c0aad1"}
Apr 21 04:47:07.730958 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:07.730867 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2" event={"ID":"89077e26-719b-4f17-87a8-374f267cd097","Type":"ContainerStarted","Data":"5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681"}
Apr 21 04:47:07.730958 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:07.730919 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:07.752859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:07.752807 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2" podStartSLOduration=1.162761616 podStartE2EDuration="5.752791189s" podCreationTimestamp="2026-04-21 04:47:02 +0000 UTC" firstStartedPulling="2026-04-21 04:47:02.836349732 +0000 UTC m=+485.463026388" lastFinishedPulling="2026-04-21 04:47:07.426379304 +0000 UTC m=+490.053055961" observedRunningTime="2026-04-21 04:47:07.750908353 +0000 UTC m=+490.377585031" watchObservedRunningTime="2026-04-21 04:47:07.752791189 +0000 UTC m=+490.379467867"
Apr 21 04:47:10.725112 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:10.725041 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8678864d96-bhm72" podUID="2e7fbddb-2697-437d-82e4-8195343dbb73" containerName="console" containerID="cri-o://ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1" gracePeriod=15
Apr 21 04:47:10.970488 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:10.970464 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8678864d96-bhm72_2e7fbddb-2697-437d-82e4-8195343dbb73/console/0.log"
Apr 21 04:47:10.970617 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:10.970526 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:47:11.051756 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.051723 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-serving-cert\") pod \"2e7fbddb-2697-437d-82e4-8195343dbb73\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") "
Apr 21 04:47:11.051942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.051784 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-trusted-ca-bundle\") pod \"2e7fbddb-2697-437d-82e4-8195343dbb73\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") "
Apr 21 04:47:11.051942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.051815 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-oauth-serving-cert\") pod \"2e7fbddb-2697-437d-82e4-8195343dbb73\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") "
Apr 21 04:47:11.051942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.051845 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-console-config\") pod \"2e7fbddb-2697-437d-82e4-8195343dbb73\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") "
Apr 21 04:47:11.051942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.051865 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbzqz\" (UniqueName: \"kubernetes.io/projected/2e7fbddb-2697-437d-82e4-8195343dbb73-kube-api-access-bbzqz\") pod \"2e7fbddb-2697-437d-82e4-8195343dbb73\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") "
Apr 21 04:47:11.051942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.051888 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-service-ca\") pod \"2e7fbddb-2697-437d-82e4-8195343dbb73\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") "
Apr 21 04:47:11.051942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.051918 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-oauth-config\") pod \"2e7fbddb-2697-437d-82e4-8195343dbb73\" (UID: \"2e7fbddb-2697-437d-82e4-8195343dbb73\") "
Apr 21 04:47:11.052300 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.052266 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2e7fbddb-2697-437d-82e4-8195343dbb73" (UID: "2e7fbddb-2697-437d-82e4-8195343dbb73"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:47:11.052439 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.052401 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-console-config" (OuterVolumeSpecName: "console-config") pod "2e7fbddb-2697-437d-82e4-8195343dbb73" (UID: "2e7fbddb-2697-437d-82e4-8195343dbb73"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:47:11.052558 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.052517 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-service-ca" (OuterVolumeSpecName: "service-ca") pod "2e7fbddb-2697-437d-82e4-8195343dbb73" (UID: "2e7fbddb-2697-437d-82e4-8195343dbb73"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:47:11.052626 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.052601 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2e7fbddb-2697-437d-82e4-8195343dbb73" (UID: "2e7fbddb-2697-437d-82e4-8195343dbb73"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:47:11.054069 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.054047 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2e7fbddb-2697-437d-82e4-8195343dbb73" (UID: "2e7fbddb-2697-437d-82e4-8195343dbb73"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:47:11.054150 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.054074 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e7fbddb-2697-437d-82e4-8195343dbb73-kube-api-access-bbzqz" (OuterVolumeSpecName: "kube-api-access-bbzqz") pod "2e7fbddb-2697-437d-82e4-8195343dbb73" (UID: "2e7fbddb-2697-437d-82e4-8195343dbb73"). InnerVolumeSpecName "kube-api-access-bbzqz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:47:11.054150 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.054144 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2e7fbddb-2697-437d-82e4-8195343dbb73" (UID: "2e7fbddb-2697-437d-82e4-8195343dbb73"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:47:11.153043 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.152997 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-serving-cert\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:11.153043 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.153034 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-trusted-ca-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:11.153043 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.153045 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-oauth-serving-cert\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:11.153297 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.153055 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-console-config\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:11.153297 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.153064 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbzqz\" (UniqueName: \"kubernetes.io/projected/2e7fbddb-2697-437d-82e4-8195343dbb73-kube-api-access-bbzqz\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:11.153297 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.153075 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e7fbddb-2697-437d-82e4-8195343dbb73-service-ca\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:11.153297 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.153085 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e7fbddb-2697-437d-82e4-8195343dbb73-console-oauth-config\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:11.747238 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.747205 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8678864d96-bhm72_2e7fbddb-2697-437d-82e4-8195343dbb73/console/0.log"
Apr 21 04:47:11.747748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.747265 2579 generic.go:358] "Generic (PLEG): container finished" podID="2e7fbddb-2697-437d-82e4-8195343dbb73" containerID="ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1" exitCode=2
Apr 21 04:47:11.747748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.747345 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8678864d96-bhm72"
Apr 21 04:47:11.747748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.747351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8678864d96-bhm72" event={"ID":"2e7fbddb-2697-437d-82e4-8195343dbb73","Type":"ContainerDied","Data":"ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1"}
Apr 21 04:47:11.747748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.747414 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8678864d96-bhm72" event={"ID":"2e7fbddb-2697-437d-82e4-8195343dbb73","Type":"ContainerDied","Data":"6626196f459eccfcca63ce4d50184e1ebfc8cc7aa116027a4c6ca0557a8ccc59"}
Apr 21 04:47:11.747748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.747430 2579 scope.go:117] "RemoveContainer" containerID="ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1"
Apr 21 04:47:11.756632 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.756609 2579 scope.go:117] "RemoveContainer" containerID="ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1"
Apr 21 04:47:11.756869 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:47:11.756848 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1\": container with ID starting with ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1 not found: ID does not exist" containerID="ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1"
Apr 21 04:47:11.756917 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.756878 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1"} err="failed to get container status \"ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1\": rpc error: code = NotFound desc = could not find container \"ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1\": container with ID starting with ad2c57db4d0064b16780dddc2caf1dd84882d41165de50f490a2d8b232f9dcd1 not found: ID does not exist"
Apr 21 04:47:11.774937 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.774910 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8678864d96-bhm72"]
Apr 21 04:47:11.785256 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.785230 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8678864d96-bhm72"]
Apr 21 04:47:11.897637 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:11.897603 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e7fbddb-2697-437d-82e4-8195343dbb73" path="/var/lib/kubelet/pods/2e7fbddb-2697-437d-82e4-8195343dbb73/volumes"
Apr 21 04:47:18.737239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:18.737208 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:19.274356 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.274313 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"]
Apr 21 04:47:19.274567 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.274544 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2" podUID="89077e26-719b-4f17-87a8-374f267cd097" containerName="manager" containerID="cri-o://5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681" gracePeriod=10
Apr 21 04:47:19.524005 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.523979 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:19.624966 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.624923 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/89077e26-719b-4f17-87a8-374f267cd097-extensions-socket-volume\") pod \"89077e26-719b-4f17-87a8-374f267cd097\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") "
Apr 21 04:47:19.625144 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.625042 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn2kl\" (UniqueName: \"kubernetes.io/projected/89077e26-719b-4f17-87a8-374f267cd097-kube-api-access-mn2kl\") pod \"89077e26-719b-4f17-87a8-374f267cd097\" (UID: \"89077e26-719b-4f17-87a8-374f267cd097\") "
Apr 21 04:47:19.625468 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.625430 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89077e26-719b-4f17-87a8-374f267cd097-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "89077e26-719b-4f17-87a8-374f267cd097" (UID: "89077e26-719b-4f17-87a8-374f267cd097"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:47:19.627083 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.627047 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89077e26-719b-4f17-87a8-374f267cd097-kube-api-access-mn2kl" (OuterVolumeSpecName: "kube-api-access-mn2kl") pod "89077e26-719b-4f17-87a8-374f267cd097" (UID: "89077e26-719b-4f17-87a8-374f267cd097"). InnerVolumeSpecName "kube-api-access-mn2kl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:47:19.726417 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.726357 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mn2kl\" (UniqueName: \"kubernetes.io/projected/89077e26-719b-4f17-87a8-374f267cd097-kube-api-access-mn2kl\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:19.726417 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.726413 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/89077e26-719b-4f17-87a8-374f267cd097-extensions-socket-volume\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:47:19.783389 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.783330 2579 generic.go:358] "Generic (PLEG): container finished" podID="89077e26-719b-4f17-87a8-374f267cd097" containerID="5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681" exitCode=0
Apr 21 04:47:19.783796 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.783434 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"
Apr 21 04:47:19.783796 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.783436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2" event={"ID":"89077e26-719b-4f17-87a8-374f267cd097","Type":"ContainerDied","Data":"5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681"}
Apr 21 04:47:19.783796 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.783480 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2" event={"ID":"89077e26-719b-4f17-87a8-374f267cd097","Type":"ContainerDied","Data":"0bd5545448dd9c59e1ca6f0a038007f035d8b8a0779ab9b8ea0535b312c0aad1"}
Apr 21 04:47:19.783796 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.783497 2579 scope.go:117] "RemoveContainer" containerID="5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681"
Apr 21 04:47:19.793599 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.793580 2579 scope.go:117] "RemoveContainer" containerID="5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681"
Apr 21 04:47:19.793866 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:47:19.793840 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681\": container with ID starting with 5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681 not found: ID does not exist" containerID="5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681"
Apr 21 04:47:19.793962 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.793872 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681"} err="failed to get container status \"5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681\": rpc error: code = NotFound desc = could not find container \"5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681\": container with ID starting with 5e3fe7a45ccda3d07122364c5380f5b2eaa43c12f483dbefc7a60a8176f5a681 not found: ID does not exist"
Apr 21 04:47:19.807591 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.807560 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"]
Apr 21 04:47:19.811582 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.811556 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wf9x2"]
Apr 21 04:47:19.897790 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:19.897711 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89077e26-719b-4f17-87a8-374f267cd097" path="/var/lib/kubelet/pods/89077e26-719b-4f17-87a8-374f267cd097/volumes"
Apr 21 04:47:39.668408 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.668354 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g5v"]
Apr 21 04:47:39.668807 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.668749 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e7fbddb-2697-437d-82e4-8195343dbb73" containerName="console"
Apr 21 04:47:39.668807 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.668761 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7fbddb-2697-437d-82e4-8195343dbb73" containerName="console"
Apr 21 04:47:39.668807 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.668771 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89077e26-719b-4f17-87a8-374f267cd097" containerName="manager"
Apr 21 04:47:39.668807 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.668777 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="89077e26-719b-4f17-87a8-374f267cd097" containerName="manager"
Apr 21 04:47:39.668934 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.668835 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="89077e26-719b-4f17-87a8-374f267cd097" containerName="manager"
Apr 21 04:47:39.668934 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.668843 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e7fbddb-2697-437d-82e4-8195343dbb73" containerName="console"
Apr 21 04:47:39.675183 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.675163 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:39.677955 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.677928 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pgtpf\""
Apr 21 04:47:39.678186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.678167 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 21 04:47:39.680851 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.680829 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g5v"]
Apr 21 04:47:39.767456 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.767422 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g5v"]
Apr 21 04:47:39.805678 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.805638 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aa01dc9a-1399-47e3-97cd-ddf18904f033-config-file\") pod \"limitador-limitador-7d549b5b-b2g5v\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:39.805851 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.805697 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx6tv\" (UniqueName: \"kubernetes.io/projected/aa01dc9a-1399-47e3-97cd-ddf18904f033-kube-api-access-sx6tv\") pod \"limitador-limitador-7d549b5b-b2g5v\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:39.907119 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.907076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aa01dc9a-1399-47e3-97cd-ddf18904f033-config-file\") pod \"limitador-limitador-7d549b5b-b2g5v\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:39.907296 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.907137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx6tv\" (UniqueName: \"kubernetes.io/projected/aa01dc9a-1399-47e3-97cd-ddf18904f033-kube-api-access-sx6tv\") pod \"limitador-limitador-7d549b5b-b2g5v\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:39.907820 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.907798 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aa01dc9a-1399-47e3-97cd-ddf18904f033-config-file\") pod \"limitador-limitador-7d549b5b-b2g5v\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:39.915882 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.915848 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx6tv\" (UniqueName: \"kubernetes.io/projected/aa01dc9a-1399-47e3-97cd-ddf18904f033-kube-api-access-sx6tv\") pod \"limitador-limitador-7d549b5b-b2g5v\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:39.987554 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:39.987469 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:40.141882 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:40.141854 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g5v"]
Apr 21 04:47:40.142986 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:47:40.142960 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa01dc9a_1399_47e3_97cd_ddf18904f033.slice/crio-b3ab42f84a5e9abd24ae80993911280ec1165c1d138131d84bb7b89abee906ff WatchSource:0}: Error finding container b3ab42f84a5e9abd24ae80993911280ec1165c1d138131d84bb7b89abee906ff: Status 404 returned error can't find the container with id b3ab42f84a5e9abd24ae80993911280ec1165c1d138131d84bb7b89abee906ff
Apr 21 04:47:40.870178 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:40.870132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v" event={"ID":"aa01dc9a-1399-47e3-97cd-ddf18904f033","Type":"ContainerStarted","Data":"b3ab42f84a5e9abd24ae80993911280ec1165c1d138131d84bb7b89abee906ff"}
Apr 21 04:47:42.879955 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:42.879919 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v" event={"ID":"aa01dc9a-1399-47e3-97cd-ddf18904f033","Type":"ContainerStarted","Data":"c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5"}
Apr 21 04:47:42.880402 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:42.879974 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:42.897419 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:42.897350 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v" podStartSLOduration=1.262276021 podStartE2EDuration="3.89733413s" podCreationTimestamp="2026-04-21 04:47:39 +0000 UTC" firstStartedPulling="2026-04-21 04:47:40.14535263 +0000 UTC m=+522.772029290" lastFinishedPulling="2026-04-21 04:47:42.780410743 +0000 UTC m=+525.407087399" observedRunningTime="2026-04-21 04:47:42.895646957 +0000 UTC m=+525.522323636" watchObservedRunningTime="2026-04-21 04:47:42.89733413 +0000 UTC m=+525.524010842"
Apr 21 04:47:53.885190 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:53.885107 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:54.827444 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:54.827403 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g5v"]
Apr 21 04:47:54.827662 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:54.827619 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v" podUID="aa01dc9a-1399-47e3-97cd-ddf18904f033" containerName="limitador" containerID="cri-o://c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5" gracePeriod=30
Apr 21 04:47:55.374007 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.373985 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v"
Apr 21 04:47:55.456493 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.456401 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx6tv\" (UniqueName: \"kubernetes.io/projected/aa01dc9a-1399-47e3-97cd-ddf18904f033-kube-api-access-sx6tv\") pod \"aa01dc9a-1399-47e3-97cd-ddf18904f033\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") "
Apr 21 04:47:55.456493 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.456467 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aa01dc9a-1399-47e3-97cd-ddf18904f033-config-file\") pod \"aa01dc9a-1399-47e3-97cd-ddf18904f033\" (UID: \"aa01dc9a-1399-47e3-97cd-ddf18904f033\") "
Apr 21 04:47:55.456849 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.456823 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa01dc9a-1399-47e3-97cd-ddf18904f033-config-file" (OuterVolumeSpecName: "config-file") pod "aa01dc9a-1399-47e3-97cd-ddf18904f033" (UID: "aa01dc9a-1399-47e3-97cd-ddf18904f033"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:47:55.458558 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.458532 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa01dc9a-1399-47e3-97cd-ddf18904f033-kube-api-access-sx6tv" (OuterVolumeSpecName: "kube-api-access-sx6tv") pod "aa01dc9a-1399-47e3-97cd-ddf18904f033" (UID: "aa01dc9a-1399-47e3-97cd-ddf18904f033"). InnerVolumeSpecName "kube-api-access-sx6tv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:47:55.557310 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.557267 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sx6tv\" (UniqueName: \"kubernetes.io/projected/aa01dc9a-1399-47e3-97cd-ddf18904f033-kube-api-access-sx6tv\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:47:55.557310 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.557302 2579 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aa01dc9a-1399-47e3-97cd-ddf18904f033-config-file\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:47:55.729375 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.729267 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-fjjjw"] Apr 21 04:47:55.729749 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.729730 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa01dc9a-1399-47e3-97cd-ddf18904f033" containerName="limitador" Apr 21 04:47:55.729843 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.729752 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa01dc9a-1399-47e3-97cd-ddf18904f033" containerName="limitador" Apr 21 04:47:55.729896 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.729873 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa01dc9a-1399-47e3-97cd-ddf18904f033" containerName="limitador" Apr 21 04:47:55.733346 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.733317 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:55.735852 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.735827 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-kplcs\"" Apr 21 04:47:55.735988 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.735922 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 04:47:55.740865 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.740835 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-fjjjw"] Apr 21 04:47:55.759709 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.759678 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fa9ff1d9-e39b-4b32-9d13-1a6806799e4d-data\") pod \"postgres-868db5846d-fjjjw\" (UID: \"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d\") " pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:55.759862 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.759765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8k87\" (UniqueName: \"kubernetes.io/projected/fa9ff1d9-e39b-4b32-9d13-1a6806799e4d-kube-api-access-g8k87\") pod \"postgres-868db5846d-fjjjw\" (UID: \"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d\") " pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:55.860793 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.860759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8k87\" (UniqueName: \"kubernetes.io/projected/fa9ff1d9-e39b-4b32-9d13-1a6806799e4d-kube-api-access-g8k87\") pod \"postgres-868db5846d-fjjjw\" (UID: \"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d\") " pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:55.860957 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.860823 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fa9ff1d9-e39b-4b32-9d13-1a6806799e4d-data\") pod \"postgres-868db5846d-fjjjw\" (UID: \"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d\") " pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:55.861153 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.861136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fa9ff1d9-e39b-4b32-9d13-1a6806799e4d-data\") pod \"postgres-868db5846d-fjjjw\" (UID: \"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d\") " pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:55.869166 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.869140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8k87\" (UniqueName: \"kubernetes.io/projected/fa9ff1d9-e39b-4b32-9d13-1a6806799e4d-kube-api-access-g8k87\") pod \"postgres-868db5846d-fjjjw\" (UID: \"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d\") " pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:55.931354 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.931317 2579 generic.go:358] "Generic (PLEG): container finished" podID="aa01dc9a-1399-47e3-97cd-ddf18904f033" containerID="c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5" exitCode=0 Apr 21 04:47:55.931553 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.931403 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v" Apr 21 04:47:55.931553 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.931401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v" event={"ID":"aa01dc9a-1399-47e3-97cd-ddf18904f033","Type":"ContainerDied","Data":"c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5"} Apr 21 04:47:55.931553 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.931510 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b2g5v" event={"ID":"aa01dc9a-1399-47e3-97cd-ddf18904f033","Type":"ContainerDied","Data":"b3ab42f84a5e9abd24ae80993911280ec1165c1d138131d84bb7b89abee906ff"} Apr 21 04:47:55.931926 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.931772 2579 scope.go:117] "RemoveContainer" containerID="c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5" Apr 21 04:47:55.947116 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.947093 2579 scope.go:117] "RemoveContainer" containerID="c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5" Apr 21 04:47:55.947421 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:47:55.947400 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5\": container with ID starting with c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5 not found: ID does not exist" containerID="c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5" Apr 21 04:47:55.947507 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.947428 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5"} err="failed to get container status 
\"c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5\": rpc error: code = NotFound desc = could not find container \"c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5\": container with ID starting with c352603dc5b5721d1895888cbf629c810fe94fe870bdf10c4ddf730114fcdfc5 not found: ID does not exist" Apr 21 04:47:55.956610 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.956581 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g5v"] Apr 21 04:47:55.959886 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:55.959861 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b2g5v"] Apr 21 04:47:56.046789 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:56.046701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:47:56.173519 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:56.173492 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-fjjjw"] Apr 21 04:47:56.175612 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:47:56.175574 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9ff1d9_e39b_4b32_9d13_1a6806799e4d.slice/crio-3c51e7ed396ca59d760fa2c1a6ed2bf17ba131561d4e91df463ede0f0f3afe59 WatchSource:0}: Error finding container 3c51e7ed396ca59d760fa2c1a6ed2bf17ba131561d4e91df463ede0f0f3afe59: Status 404 returned error can't find the container with id 3c51e7ed396ca59d760fa2c1a6ed2bf17ba131561d4e91df463ede0f0f3afe59 Apr 21 04:47:56.936167 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:56.936124 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-fjjjw" event={"ID":"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d","Type":"ContainerStarted","Data":"3c51e7ed396ca59d760fa2c1a6ed2bf17ba131561d4e91df463ede0f0f3afe59"} Apr 21 
04:47:57.899237 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:47:57.899204 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa01dc9a-1399-47e3-97cd-ddf18904f033" path="/var/lib/kubelet/pods/aa01dc9a-1399-47e3-97cd-ddf18904f033/volumes" Apr 21 04:48:01.192979 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:01.192952 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 04:48:01.958838 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:01.958794 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-fjjjw" event={"ID":"fa9ff1d9-e39b-4b32-9d13-1a6806799e4d","Type":"ContainerStarted","Data":"7f6a22ef6e532de92d7e7ea1e8a23e1cfda4e31dcc18047a8ed86d9d0f93500c"} Apr 21 04:48:01.959005 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:01.958874 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:48:01.977646 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:01.977586 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-fjjjw" podStartSLOduration=1.964459267 podStartE2EDuration="6.977570072s" podCreationTimestamp="2026-04-21 04:47:55 +0000 UTC" firstStartedPulling="2026-04-21 04:47:56.176895325 +0000 UTC m=+538.803571981" lastFinishedPulling="2026-04-21 04:48:01.19000613 +0000 UTC m=+543.816682786" observedRunningTime="2026-04-21 04:48:01.974896912 +0000 UTC m=+544.601573591" watchObservedRunningTime="2026-04-21 04:48:01.977570072 +0000 UTC m=+544.604246752" Apr 21 04:48:07.992322 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:07.992288 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-fjjjw" Apr 21 04:48:09.448198 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.448154 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb"] Apr 21 04:48:09.451731 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.451714 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.454696 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.454670 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:48:09.454696 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.454690 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:48:09.454839 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.454678 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sjrjx\"" Apr 21 04:48:09.459475 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.459451 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb"] Apr 21 04:48:09.474317 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.474278 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlv5\" (UniqueName: \"kubernetes.io/projected/5cc80bc7-54c4-42ac-984f-552f5c0ca151-kube-api-access-2wlv5\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.474501 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.474454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.474573 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.474537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.575227 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.575188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlv5\" (UniqueName: \"kubernetes.io/projected/5cc80bc7-54c4-42ac-984f-552f5c0ca151-kube-api-access-2wlv5\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.575425 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.575250 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.575425 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.575292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.575730 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.575710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.575773 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.575721 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.585169 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.585138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlv5\" (UniqueName: \"kubernetes.io/projected/5cc80bc7-54c4-42ac-984f-552f5c0ca151-kube-api-access-2wlv5\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.761772 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.761674 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:09.887561 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.887534 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb"] Apr 21 04:48:09.889335 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:48:09.889293 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cc80bc7_54c4_42ac_984f_552f5c0ca151.slice/crio-a17e1aed69509090f6c818ce9c25463263fd10fe2c9fe24d00a31e3c36f8f26a WatchSource:0}: Error finding container a17e1aed69509090f6c818ce9c25463263fd10fe2c9fe24d00a31e3c36f8f26a: Status 404 returned error can't find the container with id a17e1aed69509090f6c818ce9c25463263fd10fe2c9fe24d00a31e3c36f8f26a Apr 21 04:48:09.999965 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.999928 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" event={"ID":"5cc80bc7-54c4-42ac-984f-552f5c0ca151","Type":"ContainerStarted","Data":"5568639cf01bf032f566c5a24c8973161cf0d092599cd62dbd62c9485ca6bba5"} Apr 21 04:48:09.999965 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:09.999966 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" event={"ID":"5cc80bc7-54c4-42ac-984f-552f5c0ca151","Type":"ContainerStarted","Data":"a17e1aed69509090f6c818ce9c25463263fd10fe2c9fe24d00a31e3c36f8f26a"} Apr 21 04:48:11.005782 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:11.005621 2579 generic.go:358] "Generic (PLEG): container finished" podID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerID="5568639cf01bf032f566c5a24c8973161cf0d092599cd62dbd62c9485ca6bba5" exitCode=0 Apr 21 04:48:11.005782 ip-10-0-135-122 
kubenswrapper[2579]: I0421 04:48:11.005685 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" event={"ID":"5cc80bc7-54c4-42ac-984f-552f5c0ca151","Type":"ContainerDied","Data":"5568639cf01bf032f566c5a24c8973161cf0d092599cd62dbd62c9485ca6bba5"} Apr 21 04:48:13.014232 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:13.014198 2579 generic.go:358] "Generic (PLEG): container finished" podID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerID="a62a4607f6dd8e7edf9710fd2a7141629f84488643ed261042bf6ca1a830a234" exitCode=0 Apr 21 04:48:13.014612 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:13.014283 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" event={"ID":"5cc80bc7-54c4-42ac-984f-552f5c0ca151","Type":"ContainerDied","Data":"a62a4607f6dd8e7edf9710fd2a7141629f84488643ed261042bf6ca1a830a234"} Apr 21 04:48:14.020186 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:14.020150 2579 generic.go:358] "Generic (PLEG): container finished" podID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerID="5aa04f7486f4ff8dec992431ff5163adaf3e1068491cb37b784f272b9fcd6f05" exitCode=0 Apr 21 04:48:14.020583 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:14.020226 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" event={"ID":"5cc80bc7-54c4-42ac-984f-552f5c0ca151","Type":"ContainerDied","Data":"5aa04f7486f4ff8dec992431ff5163adaf3e1068491cb37b784f272b9fcd6f05"} Apr 21 04:48:15.156125 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.156101 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:15.226386 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.226335 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-bundle\") pod \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " Apr 21 04:48:15.226543 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.226405 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-util\") pod \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " Apr 21 04:48:15.226543 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.226449 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wlv5\" (UniqueName: \"kubernetes.io/projected/5cc80bc7-54c4-42ac-984f-552f5c0ca151-kube-api-access-2wlv5\") pod \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\" (UID: \"5cc80bc7-54c4-42ac-984f-552f5c0ca151\") " Apr 21 04:48:15.226955 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.226927 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-bundle" (OuterVolumeSpecName: "bundle") pod "5cc80bc7-54c4-42ac-984f-552f5c0ca151" (UID: "5cc80bc7-54c4-42ac-984f-552f5c0ca151"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:48:15.228559 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.228528 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc80bc7-54c4-42ac-984f-552f5c0ca151-kube-api-access-2wlv5" (OuterVolumeSpecName: "kube-api-access-2wlv5") pod "5cc80bc7-54c4-42ac-984f-552f5c0ca151" (UID: "5cc80bc7-54c4-42ac-984f-552f5c0ca151"). InnerVolumeSpecName "kube-api-access-2wlv5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:48:15.230859 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.230822 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-util" (OuterVolumeSpecName: "util") pod "5cc80bc7-54c4-42ac-984f-552f5c0ca151" (UID: "5cc80bc7-54c4-42ac-984f-552f5c0ca151"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:48:15.327495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.327458 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-bundle\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:48:15.327495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.327489 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5cc80bc7-54c4-42ac-984f-552f5c0ca151-util\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:48:15.327495 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:15.327500 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2wlv5\" (UniqueName: \"kubernetes.io/projected/5cc80bc7-54c4-42ac-984f-552f5c0ca151-kube-api-access-2wlv5\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\"" Apr 21 04:48:16.030460 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:16.030427 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" event={"ID":"5cc80bc7-54c4-42ac-984f-552f5c0ca151","Type":"ContainerDied","Data":"a17e1aed69509090f6c818ce9c25463263fd10fe2c9fe24d00a31e3c36f8f26a"} Apr 21 04:48:16.030460 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:16.030460 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17e1aed69509090f6c818ce9c25463263fd10fe2c9fe24d00a31e3c36f8f26a" Apr 21 04:48:16.030659 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:16.030491 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350tm8zb" Apr 21 04:48:22.748431 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.748391 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-6q4tb"] Apr 21 04:48:22.748951 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.748930 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerName="util" Apr 21 04:48:22.749036 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.748954 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerName="util" Apr 21 04:48:22.749036 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.748981 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerName="pull" Apr 21 04:48:22.749036 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.748990 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerName="pull" Apr 21 04:48:22.749036 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.749017 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerName="extract"
Apr 21 04:48:22.749036 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.749026 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerName="extract"
Apr 21 04:48:22.749270 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.749114 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cc80bc7-54c4-42ac-984f-552f5c0ca151" containerName="extract"
Apr 21 04:48:22.752424 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.752401 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb"
Apr 21 04:48:22.755171 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.755151 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-jqktc\""
Apr 21 04:48:22.755287 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.755253 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 21 04:48:22.755287 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.755261 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 21 04:48:22.759568 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.759543 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-6q4tb"]
Apr 21 04:48:22.897738 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.897702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls5dm\" (UniqueName: \"kubernetes.io/projected/28fb03e0-6307-4f02-ab1c-39981e2c6802-kube-api-access-ls5dm\") pod \"keycloak-operator-5c4df598dd-6q4tb\" (UID: \"28fb03e0-6307-4f02-ab1c-39981e2c6802\") " pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb"
Apr 21 04:48:22.999129 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:22.999026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ls5dm\" (UniqueName: \"kubernetes.io/projected/28fb03e0-6307-4f02-ab1c-39981e2c6802-kube-api-access-ls5dm\") pod \"keycloak-operator-5c4df598dd-6q4tb\" (UID: \"28fb03e0-6307-4f02-ab1c-39981e2c6802\") " pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb"
Apr 21 04:48:23.009232 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:23.009196 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls5dm\" (UniqueName: \"kubernetes.io/projected/28fb03e0-6307-4f02-ab1c-39981e2c6802-kube-api-access-ls5dm\") pod \"keycloak-operator-5c4df598dd-6q4tb\" (UID: \"28fb03e0-6307-4f02-ab1c-39981e2c6802\") " pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb"
Apr 21 04:48:23.064917 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:23.064887 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb"
Apr 21 04:48:23.218113 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:23.218087 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-6q4tb"]
Apr 21 04:48:23.220940 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:48:23.220905 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28fb03e0_6307_4f02_ab1c_39981e2c6802.slice/crio-34bc9ffee65fcdde5277ba82a2e2734f7d3b3e60ef87f374f75bfa946d079627 WatchSource:0}: Error finding container 34bc9ffee65fcdde5277ba82a2e2734f7d3b3e60ef87f374f75bfa946d079627: Status 404 returned error can't find the container with id 34bc9ffee65fcdde5277ba82a2e2734f7d3b3e60ef87f374f75bfa946d079627
Apr 21 04:48:24.065987 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:24.065936 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb" event={"ID":"28fb03e0-6307-4f02-ab1c-39981e2c6802","Type":"ContainerStarted","Data":"34bc9ffee65fcdde5277ba82a2e2734f7d3b3e60ef87f374f75bfa946d079627"}
Apr 21 04:48:29.092249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:29.092210 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb" event={"ID":"28fb03e0-6307-4f02-ab1c-39981e2c6802","Type":"ContainerStarted","Data":"50746ac19df98715c950ed59c9606ff9a3c56dc703e51a7d919d18fa2738506e"}
Apr 21 04:48:29.110716 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:29.110665 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-6q4tb" podStartSLOduration=1.431586801 podStartE2EDuration="7.110650758s" podCreationTimestamp="2026-04-21 04:48:22 +0000 UTC" firstStartedPulling="2026-04-21 04:48:23.222415533 +0000 UTC m=+565.849092190" lastFinishedPulling="2026-04-21 04:48:28.901479492 +0000 UTC m=+571.528156147" observedRunningTime="2026-04-21 04:48:29.108951907 +0000 UTC m=+571.735628586" watchObservedRunningTime="2026-04-21 04:48:29.110650758 +0000 UTC m=+571.737327441"
Apr 21 04:48:57.811763 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:57.811736 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log"
Apr 21 04:48:57.811763 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:48:57.811753 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dv4qj_6b281d59-c062-4407-95da-057a82e47cba/ovn-acl-logging/0.log"
Apr 21 04:49:10.532920 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.532883 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8ftm4"]
Apr 21 04:49:10.542513 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.542481 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:10.545492 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.545467 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-n4cb5\""
Apr 21 04:49:10.545625 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.545589 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8ftm4"]
Apr 21 04:49:10.628232 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.628193 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxjk6\" (UniqueName: \"kubernetes.io/projected/00b9d704-f491-49fe-99f8-615341813f18-kube-api-access-pxjk6\") pod \"maas-controller-6d4c8f55f9-8ftm4\" (UID: \"00b9d704-f491-49fe-99f8-615341813f18\") " pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:10.682933 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.682889 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-84fbf49c54-ccsc7"]
Apr 21 04:49:10.686719 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.686698 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:10.695778 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.695749 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84fbf49c54-ccsc7"]
Apr 21 04:49:10.729636 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.729591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxjk6\" (UniqueName: \"kubernetes.io/projected/00b9d704-f491-49fe-99f8-615341813f18-kube-api-access-pxjk6\") pod \"maas-controller-6d4c8f55f9-8ftm4\" (UID: \"00b9d704-f491-49fe-99f8-615341813f18\") " pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:10.741063 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.741031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxjk6\" (UniqueName: \"kubernetes.io/projected/00b9d704-f491-49fe-99f8-615341813f18-kube-api-access-pxjk6\") pod \"maas-controller-6d4c8f55f9-8ftm4\" (UID: \"00b9d704-f491-49fe-99f8-615341813f18\") " pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:10.804431 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.804395 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8ftm4"]
Apr 21 04:49:10.804687 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.804675 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:10.830748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.830712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krbvs\" (UniqueName: \"kubernetes.io/projected/f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5-kube-api-access-krbvs\") pod \"maas-controller-84fbf49c54-ccsc7\" (UID: \"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5\") " pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:10.931836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.931805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krbvs\" (UniqueName: \"kubernetes.io/projected/f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5-kube-api-access-krbvs\") pod \"maas-controller-84fbf49c54-ccsc7\" (UID: \"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5\") " pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:10.951155 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.950971 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8ftm4"]
Apr 21 04:49:10.955249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.955111 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krbvs\" (UniqueName: \"kubernetes.io/projected/f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5-kube-api-access-krbvs\") pod \"maas-controller-84fbf49c54-ccsc7\" (UID: \"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5\") " pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:10.957959 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:49:10.957933 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b9d704_f491_49fe_99f8_615341813f18.slice/crio-958f78124dcdef83fc4388dcd78968ab73c9beb15c5fd02f171e85c8f1c140fd WatchSource:0}: Error finding container 958f78124dcdef83fc4388dcd78968ab73c9beb15c5fd02f171e85c8f1c140fd: Status 404 returned error can't find the container with id 958f78124dcdef83fc4388dcd78968ab73c9beb15c5fd02f171e85c8f1c140fd
Apr 21 04:49:10.960010 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.959986 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:49:10.998419 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:10.998349 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:11.147382 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:11.144672 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84fbf49c54-ccsc7"]
Apr 21 04:49:11.265740 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:11.265704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4" event={"ID":"00b9d704-f491-49fe-99f8-615341813f18","Type":"ContainerStarted","Data":"958f78124dcdef83fc4388dcd78968ab73c9beb15c5fd02f171e85c8f1c140fd"}
Apr 21 04:49:11.266840 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:11.266813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84fbf49c54-ccsc7" event={"ID":"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5","Type":"ContainerStarted","Data":"3cec40451d4bb976765aea6f28f2fe0de08ad9f7aab166a6480e199c9a06389e"}
Apr 21 04:49:14.283955 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.283852 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4" event={"ID":"00b9d704-f491-49fe-99f8-615341813f18","Type":"ContainerStarted","Data":"f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80"}
Apr 21 04:49:14.284422 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.283946 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4" podUID="00b9d704-f491-49fe-99f8-615341813f18" containerName="manager" containerID="cri-o://f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80" gracePeriod=10
Apr 21 04:49:14.284422 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.283984 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:14.285438 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.285402 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84fbf49c54-ccsc7" event={"ID":"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5","Type":"ContainerStarted","Data":"69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b"}
Apr 21 04:49:14.285556 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.285529 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:14.302146 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.302106 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4" podStartSLOduration=1.3767935740000001 podStartE2EDuration="4.30209216s" podCreationTimestamp="2026-04-21 04:49:10 +0000 UTC" firstStartedPulling="2026-04-21 04:49:10.960186366 +0000 UTC m=+613.586863025" lastFinishedPulling="2026-04-21 04:49:13.885484954 +0000 UTC m=+616.512161611" observedRunningTime="2026-04-21 04:49:14.300201491 +0000 UTC m=+616.926878168" watchObservedRunningTime="2026-04-21 04:49:14.30209216 +0000 UTC m=+616.928768898"
Apr 21 04:49:14.324269 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.324211 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-84fbf49c54-ccsc7" podStartSLOduration=1.586343137 podStartE2EDuration="4.324190021s" podCreationTimestamp="2026-04-21 04:49:10 +0000 UTC" firstStartedPulling="2026-04-21 04:49:11.150411072 +0000 UTC m=+613.777087731" lastFinishedPulling="2026-04-21 04:49:13.888257959 +0000 UTC m=+616.514934615" observedRunningTime="2026-04-21 04:49:14.320571251 +0000 UTC m=+616.947247932" watchObservedRunningTime="2026-04-21 04:49:14.324190021 +0000 UTC m=+616.950866740"
Apr 21 04:49:14.517258 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.517231 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:14.670323 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.670289 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxjk6\" (UniqueName: \"kubernetes.io/projected/00b9d704-f491-49fe-99f8-615341813f18-kube-api-access-pxjk6\") pod \"00b9d704-f491-49fe-99f8-615341813f18\" (UID: \"00b9d704-f491-49fe-99f8-615341813f18\") "
Apr 21 04:49:14.672545 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.672512 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b9d704-f491-49fe-99f8-615341813f18-kube-api-access-pxjk6" (OuterVolumeSpecName: "kube-api-access-pxjk6") pod "00b9d704-f491-49fe-99f8-615341813f18" (UID: "00b9d704-f491-49fe-99f8-615341813f18"). InnerVolumeSpecName "kube-api-access-pxjk6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:49:14.771610 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:14.771575 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxjk6\" (UniqueName: \"kubernetes.io/projected/00b9d704-f491-49fe-99f8-615341813f18-kube-api-access-pxjk6\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:49:15.290653 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.290618 2579 generic.go:358] "Generic (PLEG): container finished" podID="00b9d704-f491-49fe-99f8-615341813f18" containerID="f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80" exitCode=0
Apr 21 04:49:15.291119 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.290692 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4"
Apr 21 04:49:15.291119 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.290695 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4" event={"ID":"00b9d704-f491-49fe-99f8-615341813f18","Type":"ContainerDied","Data":"f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80"}
Apr 21 04:49:15.291119 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.290795 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8ftm4" event={"ID":"00b9d704-f491-49fe-99f8-615341813f18","Type":"ContainerDied","Data":"958f78124dcdef83fc4388dcd78968ab73c9beb15c5fd02f171e85c8f1c140fd"}
Apr 21 04:49:15.291119 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.290810 2579 scope.go:117] "RemoveContainer" containerID="f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80"
Apr 21 04:49:15.299601 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.299585 2579 scope.go:117] "RemoveContainer" containerID="f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80"
Apr 21 04:49:15.299871 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:49:15.299852 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80\": container with ID starting with f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80 not found: ID does not exist" containerID="f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80"
Apr 21 04:49:15.299923 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.299882 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80"} err="failed to get container status \"f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80\": rpc error: code = NotFound desc = could not find container \"f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80\": container with ID starting with f3a7a6654e8b09345d1f1a5abe28e1a274387d86ab5c00abfc5e1f02f9173a80 not found: ID does not exist"
Apr 21 04:49:15.312784 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.312756 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8ftm4"]
Apr 21 04:49:15.317280 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.317248 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8ftm4"]
Apr 21 04:49:15.896922 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:15.896885 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b9d704-f491-49fe-99f8-615341813f18" path="/var/lib/kubelet/pods/00b9d704-f491-49fe-99f8-615341813f18/volumes"
Apr 21 04:49:24.754748 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:24.754656 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-84fbf49c54-ccsc7"]
Apr 21 04:49:24.755214 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:24.754992 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-84fbf49c54-ccsc7" podUID="f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5" containerName="manager" containerID="cri-o://69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b" gracePeriod=10
Apr 21 04:49:24.762765 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:24.762736 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:25.000528 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.000502 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:25.039900 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.039819 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-554ff88cc4-nb4wx"]
Apr 21 04:49:25.040197 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.040185 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00b9d704-f491-49fe-99f8-615341813f18" containerName="manager"
Apr 21 04:49:25.040248 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.040199 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b9d704-f491-49fe-99f8-615341813f18" containerName="manager"
Apr 21 04:49:25.040248 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.040221 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5" containerName="manager"
Apr 21 04:49:25.040248 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.040228 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5" containerName="manager"
Apr 21 04:49:25.040337 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.040293 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="00b9d704-f491-49fe-99f8-615341813f18" containerName="manager"
Apr 21 04:49:25.040337 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.040304 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5" containerName="manager"
Apr 21 04:49:25.043564 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.043549 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-554ff88cc4-nb4wx"
Apr 21 04:49:25.052727 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.052704 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-554ff88cc4-nb4wx"]
Apr 21 04:49:25.171892 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.171856 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krbvs\" (UniqueName: \"kubernetes.io/projected/f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5-kube-api-access-krbvs\") pod \"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5\" (UID: \"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5\") "
Apr 21 04:49:25.172151 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.172127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzg7r\" (UniqueName: \"kubernetes.io/projected/345c64b0-5e6b-4184-b796-ba834837175f-kube-api-access-gzg7r\") pod \"maas-controller-554ff88cc4-nb4wx\" (UID: \"345c64b0-5e6b-4184-b796-ba834837175f\") " pod="opendatahub/maas-controller-554ff88cc4-nb4wx"
Apr 21 04:49:25.174065 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.174039 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5-kube-api-access-krbvs" (OuterVolumeSpecName: "kube-api-access-krbvs") pod "f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5" (UID: "f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5"). InnerVolumeSpecName "kube-api-access-krbvs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:49:25.272941 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.272906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzg7r\" (UniqueName: \"kubernetes.io/projected/345c64b0-5e6b-4184-b796-ba834837175f-kube-api-access-gzg7r\") pod \"maas-controller-554ff88cc4-nb4wx\" (UID: \"345c64b0-5e6b-4184-b796-ba834837175f\") " pod="opendatahub/maas-controller-554ff88cc4-nb4wx"
Apr 21 04:49:25.273117 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.272962 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-krbvs\" (UniqueName: \"kubernetes.io/projected/f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5-kube-api-access-krbvs\") on node \"ip-10-0-135-122.ec2.internal\" DevicePath \"\""
Apr 21 04:49:25.281765 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.281729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzg7r\" (UniqueName: \"kubernetes.io/projected/345c64b0-5e6b-4184-b796-ba834837175f-kube-api-access-gzg7r\") pod \"maas-controller-554ff88cc4-nb4wx\" (UID: \"345c64b0-5e6b-4184-b796-ba834837175f\") " pod="opendatahub/maas-controller-554ff88cc4-nb4wx"
Apr 21 04:49:25.337206 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.337172 2579 generic.go:358] "Generic (PLEG): container finished" podID="f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5" containerID="69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b" exitCode=0
Apr 21 04:49:25.337352 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.337241 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-84fbf49c54-ccsc7"
Apr 21 04:49:25.337352 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.337258 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84fbf49c54-ccsc7" event={"ID":"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5","Type":"ContainerDied","Data":"69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b"}
Apr 21 04:49:25.337352 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.337298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84fbf49c54-ccsc7" event={"ID":"f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5","Type":"ContainerDied","Data":"3cec40451d4bb976765aea6f28f2fe0de08ad9f7aab166a6480e199c9a06389e"}
Apr 21 04:49:25.337352 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.337318 2579 scope.go:117] "RemoveContainer" containerID="69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b"
Apr 21 04:49:25.346557 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.346538 2579 scope.go:117] "RemoveContainer" containerID="69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b"
Apr 21 04:49:25.346824 ip-10-0-135-122 kubenswrapper[2579]: E0421 04:49:25.346805 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b\": container with ID starting with 69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b not found: ID does not exist" containerID="69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b"
Apr 21 04:49:25.346881 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.346833 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b"} err="failed to get container status \"69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b\": rpc error: code = NotFound desc = could not find container \"69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b\": container with ID starting with 69c8cc8b087fe657b1b55904087ce8265e386c31f815c090beda1630f73cae4b not found: ID does not exist"
Apr 21 04:49:25.355702 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.355683 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-554ff88cc4-nb4wx"
Apr 21 04:49:25.361147 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.361118 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-84fbf49c54-ccsc7"]
Apr 21 04:49:25.364751 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.364729 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-84fbf49c54-ccsc7"]
Apr 21 04:49:25.485354 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.485322 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-554ff88cc4-nb4wx"]
Apr 21 04:49:25.486910 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:49:25.486881 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345c64b0_5e6b_4184_b796_ba834837175f.slice/crio-ee55edfe9ee3d738f992b00afe4a8187fedf75b4b35cb64a6152e1fb6f8da5ac WatchSource:0}: Error finding container ee55edfe9ee3d738f992b00afe4a8187fedf75b4b35cb64a6152e1fb6f8da5ac: Status 404 returned error can't find the container with id ee55edfe9ee3d738f992b00afe4a8187fedf75b4b35cb64a6152e1fb6f8da5ac
Apr 21 04:49:25.898092 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:25.898062 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5" path="/var/lib/kubelet/pods/f4aa9ffc-c4ab-4a80-bbdf-a190d900f1b5/volumes"
Apr 21 04:49:26.342734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:26.342685 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-554ff88cc4-nb4wx" event={"ID":"345c64b0-5e6b-4184-b796-ba834837175f","Type":"ContainerStarted","Data":"ac989faf86b33093ce2cc1f81cb108df5d65c900c203feb7eda612ceebf4fac5"}
Apr 21 04:49:26.342734 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:26.342725 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-554ff88cc4-nb4wx" event={"ID":"345c64b0-5e6b-4184-b796-ba834837175f","Type":"ContainerStarted","Data":"ee55edfe9ee3d738f992b00afe4a8187fedf75b4b35cb64a6152e1fb6f8da5ac"}
Apr 21 04:49:26.343010 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:26.342765 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-554ff88cc4-nb4wx"
Apr 21 04:49:26.366245 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:26.366195 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-554ff88cc4-nb4wx" podStartSLOduration=1.039755149 podStartE2EDuration="1.366180062s" podCreationTimestamp="2026-04-21 04:49:25 +0000 UTC" firstStartedPulling="2026-04-21 04:49:25.488265967 +0000 UTC m=+628.114942622" lastFinishedPulling="2026-04-21 04:49:25.814690876 +0000 UTC m=+628.441367535" observedRunningTime="2026-04-21 04:49:26.363865456 +0000 UTC m=+628.990542135" watchObservedRunningTime="2026-04-21 04:49:26.366180062 +0000 UTC m=+628.992856739"
Apr 21 04:49:37.353654 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:49:37.353619 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-554ff88cc4-nb4wx"
Apr 21 04:50:33.914481 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.914445 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"]
Apr 21 04:50:33.918399 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.918356 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:33.921249 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.921224 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 21 04:50:33.922264 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.922240 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 21 04:50:33.922264 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.922250 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-pxf66\""
Apr 21 04:50:33.922448 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.922333 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 21 04:50:33.926812 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.926780 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"]
Apr 21 04:50:33.982080 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.982046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:33.982250 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.982094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:33.982250 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.982118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:33.982250 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.982165 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz776\" (UniqueName: \"kubernetes.io/projected/55c71c03-4c16-4a50-875c-db6da4399bc1-kube-api-access-jz776\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:33.982250 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.982228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55c71c03-4c16-4a50-875c-db6da4399bc1-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:33.982506 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:33.982276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083039 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.082991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083239 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083434 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz776\" (UniqueName: \"kubernetes.io/projected/55c71c03-4c16-4a50-875c-db6da4399bc1-kube-api-access-jz776\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083434 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55c71c03-4c16-4a50-875c-db6da4399bc1-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083552 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083438 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083643 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.083898 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.083862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.085794 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.085746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55c71c03-4c16-4a50-875c-db6da4399bc1-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.086233 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.086197 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55c71c03-4c16-4a50-875c-db6da4399bc1-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.098654 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.098633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz776\" (UniqueName: \"kubernetes.io/projected/55c71c03-4c16-4a50-875c-db6da4399bc1-kube-api-access-jz776\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-4cplp\" (UID: \"55c71c03-4c16-4a50-875c-db6da4399bc1\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"
Apr 21 04:50:34.230216 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.230127 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" Apr 21 04:50:34.373823 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.373797 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp"] Apr 21 04:50:34.374604 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:50:34.374575 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c71c03_4c16_4a50_875c_db6da4399bc1.slice/crio-8a2a8988af4d456ad439231f39b06a30370f35ff79f461ac306a167ef099e8f7 WatchSource:0}: Error finding container 8a2a8988af4d456ad439231f39b06a30370f35ff79f461ac306a167ef099e8f7: Status 404 returned error can't find the container with id 8a2a8988af4d456ad439231f39b06a30370f35ff79f461ac306a167ef099e8f7 Apr 21 04:50:34.621120 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:34.621069 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" event={"ID":"55c71c03-4c16-4a50-875c-db6da4399bc1","Type":"ContainerStarted","Data":"8a2a8988af4d456ad439231f39b06a30370f35ff79f461ac306a167ef099e8f7"} Apr 21 04:50:40.655348 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:40.655310 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" event={"ID":"55c71c03-4c16-4a50-875c-db6da4399bc1","Type":"ContainerStarted","Data":"50ff4b50bb19c7407118632dc0b0e50af2e409f896153639990702b2539a3c98"} Apr 21 04:50:45.678770 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:45.678739 2579 generic.go:358] "Generic (PLEG): container finished" podID="55c71c03-4c16-4a50-875c-db6da4399bc1" containerID="50ff4b50bb19c7407118632dc0b0e50af2e409f896153639990702b2539a3c98" exitCode=0 Apr 21 04:50:45.679149 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:45.678818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" event={"ID":"55c71c03-4c16-4a50-875c-db6da4399bc1","Type":"ContainerDied","Data":"50ff4b50bb19c7407118632dc0b0e50af2e409f896153639990702b2539a3c98"} Apr 21 04:50:47.689377 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:47.689329 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" event={"ID":"55c71c03-4c16-4a50-875c-db6da4399bc1","Type":"ContainerStarted","Data":"8a868975a9eba248c2603e030e76cb30b76d0cdd268616e867c90523b9c6d4fb"} Apr 21 04:50:47.689754 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:47.689580 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" Apr 21 04:50:47.710428 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:47.710356 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" podStartSLOduration=2.204405766 podStartE2EDuration="14.710342369s" podCreationTimestamp="2026-04-21 04:50:33 +0000 UTC" firstStartedPulling="2026-04-21 04:50:34.376373221 +0000 UTC m=+697.003049891" lastFinishedPulling="2026-04-21 04:50:46.882309825 +0000 UTC m=+709.508986494" observedRunningTime="2026-04-21 04:50:47.709039298 +0000 UTC m=+710.335715978" watchObservedRunningTime="2026-04-21 04:50:47.710342369 +0000 UTC m=+710.337019047" Apr 21 04:50:58.706215 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:50:58.706181 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-4cplp" Apr 21 04:52:33.167154 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:33.167081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-9fwhf_9fb5f7b9-9184-474f-a0cf-7bde29f6547f/manager/0.log" Apr 21 04:52:33.408836 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:33.408809 2579 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-554ff88cc4-nb4wx_345c64b0-5e6b-4184-b796-ba834837175f/manager/0.log" Apr 21 04:52:33.522977 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:33.522898 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hgptw_ad901c4d-2781-4a4c-b833-73dd358da08d/manager/2.log" Apr 21 04:52:33.634097 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:33.634044 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-55ddb68486-mlwd6_caf15f45-4d35-43a2-af97-c5203c5e3bc5/manager/0.log" Apr 21 04:52:33.986047 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:33.986016 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-fjjjw_fa9ff1d9-e39b-4b32-9d13-1a6806799e4d/postgres/0.log" Apr 21 04:52:34.736931 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.736832 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h_89c8de52-d4f7-41e0-9564-c4998283fee0/util/0.log" Apr 21 04:52:34.749720 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.749696 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h_89c8de52-d4f7-41e0-9564-c4998283fee0/pull/0.log" Apr 21 04:52:34.761315 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.761286 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h_89c8de52-d4f7-41e0-9564-c4998283fee0/extract/0.log" Apr 21 04:52:34.870230 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.870205 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj_687ba120-5872-4868-8d82-71688d3868b5/util/0.log" 
Apr 21 04:52:34.877039 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.877014 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj_687ba120-5872-4868-8d82-71688d3868b5/pull/0.log" Apr 21 04:52:34.884554 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.884532 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj_687ba120-5872-4868-8d82-71688d3868b5/extract/0.log" Apr 21 04:52:34.990413 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.990321 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg_c02fdf1f-f348-49e7-b758-6a6122200fb8/util/0.log" Apr 21 04:52:34.997022 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:34.996998 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg_c02fdf1f-f348-49e7-b758-6a6122200fb8/pull/0.log" Apr 21 04:52:35.003418 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:35.003400 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg_c02fdf1f-f348-49e7-b758-6a6122200fb8/extract/0.log" Apr 21 04:52:35.106181 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:35.106151 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6_48c195b5-8367-4ab4-bc83-68ff346679da/util/0.log" Apr 21 04:52:35.112820 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:35.112800 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6_48c195b5-8367-4ab4-bc83-68ff346679da/pull/0.log" Apr 21 04:52:35.122251 ip-10-0-135-122 kubenswrapper[2579]: I0421 
04:52:35.122233 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6_48c195b5-8367-4ab4-bc83-68ff346679da/extract/0.log" Apr 21 04:52:35.355052 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:35.355021 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4dfgn_1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13/manager/0.log" Apr 21 04:52:36.483478 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:36.483449 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-wskqn_fd17ffd0-3d66-4970-b3f0-43338283c480/discovery/0.log" Apr 21 04:52:36.690912 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:36.690882 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-788fdfdbbd-6df9j_98930cc8-217b-42db-b1f2-816573cc740a/kube-auth-proxy/0.log" Apr 21 04:52:37.497280 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:37.497251 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-4cplp_55c71c03-4c16-4a50-875c-db6da4399bc1/storage-initializer/0.log" Apr 21 04:52:37.504535 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:37.504510 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-4cplp_55c71c03-4c16-4a50-875c-db6da4399bc1/main/0.log" Apr 21 04:52:44.480753 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:44.480722 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6rnkb_533e48e5-7652-4081-aa24-2f0eaed21d14/global-pull-secret-syncer/0.log" Apr 21 04:52:44.557144 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:44.557104 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-djdfl_c3250d03-5ab0-4acf-8145-601ce40b14a2/konnectivity-agent/0.log" Apr 
21 04:52:44.615339 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:44.615304 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-122.ec2.internal_22c8428fe5241945337c215cc12a9733/haproxy/0.log" Apr 21 04:52:48.618411 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.618354 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h_89c8de52-d4f7-41e0-9564-c4998283fee0/extract/0.log" Apr 21 04:52:48.641932 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.641895 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h_89c8de52-d4f7-41e0-9564-c4998283fee0/util/0.log" Apr 21 04:52:48.668392 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.668344 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759wsb5h_89c8de52-d4f7-41e0-9564-c4998283fee0/pull/0.log" Apr 21 04:52:48.699067 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.699008 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj_687ba120-5872-4868-8d82-71688d3868b5/extract/0.log" Apr 21 04:52:48.722162 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.722137 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj_687ba120-5872-4868-8d82-71688d3868b5/util/0.log" Apr 21 04:52:48.750728 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.750683 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mskbj_687ba120-5872-4868-8d82-71688d3868b5/pull/0.log" Apr 21 04:52:48.785008 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.784977 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg_c02fdf1f-f348-49e7-b758-6a6122200fb8/extract/0.log" Apr 21 04:52:48.811266 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.811234 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg_c02fdf1f-f348-49e7-b758-6a6122200fb8/util/0.log" Apr 21 04:52:48.835790 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.835762 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73hnnlg_c02fdf1f-f348-49e7-b758-6a6122200fb8/pull/0.log" Apr 21 04:52:48.863160 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.863136 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6_48c195b5-8367-4ab4-bc83-68ff346679da/extract/0.log" Apr 21 04:52:48.885606 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.885538 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6_48c195b5-8367-4ab4-bc83-68ff346679da/util/0.log" Apr 21 04:52:48.915146 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.915115 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1zbmh6_48c195b5-8367-4ab4-bc83-68ff346679da/pull/0.log" Apr 21 04:52:48.984863 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:48.984834 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4dfgn_1e3f3eb9-cd95-443f-a8e6-5297ac3a8b13/manager/0.log" Apr 21 04:52:50.907020 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:50.906988 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d77nd_380f7265-1a51-467a-9169-1011757d613d/kube-state-metrics/0.log" Apr 21 04:52:50.930924 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:50.930900 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d77nd_380f7265-1a51-467a-9169-1011757d613d/kube-rbac-proxy-main/0.log" Apr 21 04:52:50.955542 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:50.955495 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d77nd_380f7265-1a51-467a-9169-1011757d613d/kube-rbac-proxy-self/0.log" Apr 21 04:52:50.988324 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:50.988300 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-599ff764cd-p45c5_115c3833-f05a-48f9-a403-7f41408d8114/metrics-server/0.log" Apr 21 04:52:51.013158 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.013130 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-474k4_a1eaa9fb-25fe-4e1f-a838-4fcbae8a5c23/monitoring-plugin/0.log" Apr 21 04:52:51.205557 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.205474 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-trq4k_730b7810-de68-4798-9079-e3cdd2121300/node-exporter/0.log" Apr 21 04:52:51.228626 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.228592 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-trq4k_730b7810-de68-4798-9079-e3cdd2121300/kube-rbac-proxy/0.log" Apr 21 04:52:51.249934 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.249913 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-trq4k_730b7810-de68-4798-9079-e3cdd2121300/init-textfile/0.log" Apr 21 04:52:51.539799 ip-10-0-135-122 kubenswrapper[2579]: 
I0421 04:52:51.539719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kk7nd_ec960f2e-772b-471f-96bb-a0d4b9ff4f15/prometheus-operator/0.log" Apr 21 04:52:51.561194 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.561170 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kk7nd_ec960f2e-772b-471f-96bb-a0d4b9ff4f15/kube-rbac-proxy/0.log" Apr 21 04:52:51.614209 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.614181 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f67867bb5-f4rv6_0fd5110b-1359-44b2-ba72-c15680c47476/telemeter-client/0.log" Apr 21 04:52:51.652972 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.652944 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f67867bb5-f4rv6_0fd5110b-1359-44b2-ba72-c15680c47476/reload/0.log" Apr 21 04:52:51.674480 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:51.674459 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f67867bb5-f4rv6_0fd5110b-1359-44b2-ba72-c15680c47476/kube-rbac-proxy/0.log" Apr 21 04:52:52.864561 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.864524 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg"] Apr 21 04:52:52.868004 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.867980 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:52.870854 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.870830 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tffw4\"/\"kube-root-ca.crt\"" Apr 21 04:52:52.870972 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.870902 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tffw4\"/\"openshift-service-ca.crt\"" Apr 21 04:52:52.872026 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.872000 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tffw4\"/\"default-dockercfg-49xnb\"" Apr 21 04:52:52.876458 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.876434 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg"] Apr 21 04:52:52.972134 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.972100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-podres\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:52.972315 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.972161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rkm5\" (UniqueName: \"kubernetes.io/projected/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-kube-api-access-4rkm5\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:52.972315 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.972293 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-lib-modules\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:52.972434 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.972349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-proc\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:52.972434 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.972398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-sys\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:52.986413 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:52.986353 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-vx96n_5889b438-5ad7-4587-ad98-78b9ed6b52a5/networking-console-plugin/0.log" Apr 21 04:52:53.073300 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-podres\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073478 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073324 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rkm5\" (UniqueName: \"kubernetes.io/projected/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-kube-api-access-4rkm5\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073478 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073435 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-lib-modules\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073478 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073437 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-podres\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073646 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073550 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-proc\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073646 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073565 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-lib-modules\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " 
pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073646 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-sys\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073646 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073629 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-proc\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.073791 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.073688 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-sys\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.082511 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.082487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rkm5\" (UniqueName: \"kubernetes.io/projected/3e89c6d1-2cf4-4443-9df6-94219b73f0b3-kube-api-access-4rkm5\") pod \"perf-node-gather-daemonset-6fxcg\" (UID: \"3e89c6d1-2cf4-4443-9df6-94219b73f0b3\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" Apr 21 04:52:53.179533 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.179443 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg"
Apr 21 04:52:53.308014 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:53.307985 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg"]
Apr 21 04:52:53.308809 ip-10-0-135-122 kubenswrapper[2579]: W0421 04:52:53.308776 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3e89c6d1_2cf4_4443_9df6_94219b73f0b3.slice/crio-d8230b27f3894c6eba342165d9652f1566a69c1e4fe4132450abfc2024f41ab2 WatchSource:0}: Error finding container d8230b27f3894c6eba342165d9652f1566a69c1e4fe4132450abfc2024f41ab2: Status 404 returned error can't find the container with id d8230b27f3894c6eba342165d9652f1566a69c1e4fe4132450abfc2024f41ab2
Apr 21 04:52:54.003289 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:54.003262 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59f6d74895-724gv_fee2d97b-4b30-43a9-8e85-2333a013d782/console/0.log"
Apr 21 04:52:54.034422 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:54.034390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7wgpt_5ce78fb5-83b3-4d9c-8aeb-db5f4df5abb5/download-server/0.log"
Apr 21 04:52:54.222222 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:54.222190 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" event={"ID":"3e89c6d1-2cf4-4443-9df6-94219b73f0b3","Type":"ContainerStarted","Data":"8f13b0b54c17da0bd7767c48bbd7d45c0fa0f1f8b569f047ff5853d493dbd875"}
Apr 21 04:52:54.222222 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:54.222227 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" event={"ID":"3e89c6d1-2cf4-4443-9df6-94219b73f0b3","Type":"ContainerStarted","Data":"d8230b27f3894c6eba342165d9652f1566a69c1e4fe4132450abfc2024f41ab2"}
Apr 21 04:52:54.222481 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:54.222265 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg"
Apr 21 04:52:54.241192 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:54.241150 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg" podStartSLOduration=2.241138864 podStartE2EDuration="2.241138864s" podCreationTimestamp="2026-04-21 04:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:52:54.238008746 +0000 UTC m=+836.864685424" watchObservedRunningTime="2026-04-21 04:52:54.241138864 +0000 UTC m=+836.867815542"
Apr 21 04:52:54.547542 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:54.547502 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-pv9h6_3848cc69-2658-432a-9bc4-45e27bb60167/volume-data-source-validator/0.log"
Apr 21 04:52:55.477165 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:55.477138 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vk48w_56593aaa-f779-4c98-94da-5b75ed6e9124/dns/0.log"
Apr 21 04:52:55.502913 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:55.502879 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vk48w_56593aaa-f779-4c98-94da-5b75ed6e9124/kube-rbac-proxy/0.log"
Apr 21 04:52:55.526694 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:55.526666 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-db4v4_b54dd57a-4c1d-4f99-a559-3e4be3f7266f/dns-node-resolver/0.log"
Apr 21 04:52:56.029716 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:56.029687 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-66898969c-9dljf_b3ea294c-9c51-4fc3-a684-cce4c126b2a3/registry/0.log"
Apr 21 04:52:56.051350 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:56.051326 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5x8r5_43feefbe-ff70-4e7a-8ad0-1791e41e4c6c/node-ca/0.log"
Apr 21 04:52:56.980054 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:56.980021 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-wskqn_fd17ffd0-3d66-4970-b3f0-43338283c480/discovery/0.log"
Apr 21 04:52:57.030381 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:57.030338 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-788fdfdbbd-6df9j_98930cc8-217b-42db-b1f2-816573cc740a/kube-auth-proxy/0.log"
Apr 21 04:52:57.716580 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:57.716548 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-j8jtk_f8e3ff29-fd8e-4e1d-a4ed-a25ba7dbd7d2/serve-healthcheck-canary/0.log"
Apr 21 04:52:58.446080 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:58.446053 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zwvsk_1202724e-b9b9-4a9b-893c-a0fd11838120/kube-rbac-proxy/0.log"
Apr 21 04:52:58.468201 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:58.468175 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zwvsk_1202724e-b9b9-4a9b-893c-a0fd11838120/exporter/0.log"
Apr 21 04:52:58.490801 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:52:58.490779 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zwvsk_1202724e-b9b9-4a9b-893c-a0fd11838120/extractor/0.log"
Apr 21 04:53:00.236058 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:00.236034 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-6fxcg"
Apr 21 04:53:00.467980 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:00.467948 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-9fwhf_9fb5f7b9-9184-474f-a0cf-7bde29f6547f/manager/0.log"
Apr 21 04:53:00.517942 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:00.517867 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-554ff88cc4-nb4wx_345c64b0-5e6b-4184-b796-ba834837175f/manager/0.log"
Apr 21 04:53:00.540021 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:00.539992 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hgptw_ad901c4d-2781-4a4c-b833-73dd358da08d/manager/1.log"
Apr 21 04:53:00.551122 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:00.551095 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hgptw_ad901c4d-2781-4a4c-b833-73dd358da08d/manager/2.log"
Apr 21 04:53:00.573378 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:00.573348 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-55ddb68486-mlwd6_caf15f45-4d35-43a2-af97-c5203c5e3bc5/manager/0.log"
Apr 21 04:53:00.675671 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:00.675645 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-fjjjw_fa9ff1d9-e39b-4b32-9d13-1a6806799e4d/postgres/0.log"
Apr 21 04:53:02.161647 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:02.161623 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-28rj6_fd580cbc-c1d8-40b7-8b5e-701fe02ac604/openshift-lws-operator/0.log"
Apr 21 04:53:06.930132 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:06.930099 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-s5bdm_30ec7bfa-4b0d-470f-912f-87600811562b/kube-storage-version-migrator-operator/1.log"
Apr 21 04:53:06.930875 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:06.930854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-s5bdm_30ec7bfa-4b0d-470f-912f-87600811562b/kube-storage-version-migrator-operator/0.log"
Apr 21 04:53:07.960379 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:07.960334 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvz8m_506082c4-3364-48e7-a27f-927f2729dde4/kube-multus-additional-cni-plugins/0.log"
Apr 21 04:53:07.983049 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:07.983026 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvz8m_506082c4-3364-48e7-a27f-927f2729dde4/egress-router-binary-copy/0.log"
Apr 21 04:53:08.005167 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.005140 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvz8m_506082c4-3364-48e7-a27f-927f2729dde4/cni-plugins/0.log"
Apr 21 04:53:08.028814 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.028787 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvz8m_506082c4-3364-48e7-a27f-927f2729dde4/bond-cni-plugin/0.log"
Apr 21 04:53:08.051786 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.051755 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvz8m_506082c4-3364-48e7-a27f-927f2729dde4/routeoverride-cni/0.log"
Apr 21 04:53:08.074035 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.074008 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvz8m_506082c4-3364-48e7-a27f-927f2729dde4/whereabouts-cni-bincopy/0.log"
Apr 21 04:53:08.096376 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.096347 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvz8m_506082c4-3364-48e7-a27f-927f2729dde4/whereabouts-cni/0.log"
Apr 21 04:53:08.526151 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.526121 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wkps9_e5cf2a49-609f-4790-abef-7cf1ee58cdbc/kube-multus/0.log"
Apr 21 04:53:08.624319 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.624289 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jxwc7_e5103329-ae63-4574-9dcc-140804f95f79/network-metrics-daemon/0.log"
Apr 21 04:53:08.656903 ip-10-0-135-122 kubenswrapper[2579]: I0421 04:53:08.656876 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jxwc7_e5103329-ae63-4574-9dcc-140804f95f79/kube-rbac-proxy/0.log"