Apr 20 19:18:08.229305 ip-10-0-133-149 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 19:18:08.229316 ip-10-0-133-149 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 19:18:08.229324 ip-10-0-133-149 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 19:18:08.229530 ip-10-0-133-149 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 19:18:18.272745 ip-10-0-133-149 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 19:18:18.272763 ip-10-0-133-149 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6f303defc75d48cf89eb6812a153e387 --
Apr 20 19:20:35.162016 ip-10-0-133-149 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 19:20:35.608424 ip-10-0-133-149 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:20:35.608424 ip-10-0-133-149 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 19:20:35.608424 ip-10-0-133-149 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:20:35.608424 ip-10-0-133-149 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 19:20:35.608424 ip-10-0-133-149 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
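The five deprecated-flag warnings above all point to the same remedy: carry those settings in the file passed to --config, which the FLAG dump further down shows as /etc/kubernetes/kubelet.conf. A minimal KubeletConfiguration sketch using this node's own flag values; field names come from the kubelet.config.k8s.io/v1beta1 API, and the unix:// scheme plus the eviction threshold are illustrative assumptions, not values taken from this log:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration has no config-file equivalent; the warning
# says to express the intent as eviction thresholds instead, for example:
evictionHard:
  memory.available: "100Mi"   # placeholder threshold, not from this log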
Apr 20 19:20:35.611024 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.610913 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 19:20:35.613809 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613795 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:20:35.613809 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613809 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613813 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613817 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613819 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613822 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613825 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613831 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613834 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613837 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613840 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613843 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613845 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613848 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613851 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613853 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613856 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613859 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613861 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613864 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613866 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:20:35.613872 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613869 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613871 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613874 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613877 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613880 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613882 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613885 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613888 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613891 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613893 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613896 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613898 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613901 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613905 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613908 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613911 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613914 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613916 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613919 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:20:35.614360 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613922 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613924 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613926 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613929 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613931 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613934 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613936 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613939 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613941 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613943 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613946 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613949 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613952 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613955 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613958 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613961 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613964 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613966 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613969 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613971 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:20:35.614869 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613974 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613977 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613979 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613982 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613984 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613987 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613990 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613992 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613995 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.613997 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614000 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614002 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614005 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614008 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614010 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614013 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614018 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614021 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614024 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614027 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:20:35.615363 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614030 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:20:35.615861 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614033 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:20:35.615861 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614036 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:20:35.615861 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614039 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:20:35.615861 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614042 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:20:35.615861 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.614045 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615898 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615912 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615924 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615928 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615933 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615936 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615941 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615945 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 19:20:35.617831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615949 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615952 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615955 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615959 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615962 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615965 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615968 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615971 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615974 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615977 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615980 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615984 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615986 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615989 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615992 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615996 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.615999 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616002 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616005 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616008 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616011 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616014 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616017 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616020 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616023 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 19:20:35.618591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616028 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616031 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616034 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616037 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616040 2577 flags.go:64] FLAG: --enable-server="true"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616045 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616050 2577 flags.go:64] FLAG: --event-burst="100"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616053 2577 flags.go:64] FLAG: --event-qps="50"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616055 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616058 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616061 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616066 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616069 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616072 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616074 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616077 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616080 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616083 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616087 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616089 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616092 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616095 2577 flags.go:64] FLAG: --feature-gates=""
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616099 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616101 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616104 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 19:20:35.619314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616107 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616110 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616113 2577 flags.go:64] FLAG: --help="false"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616117 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-133-149.ec2.internal"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616120 2577 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616123 2577 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616125 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616130 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616133 2577 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616136 2577 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616139 2577 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616142 2577 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616147 2577 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616150 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616153 2577 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616156 2577 flags.go:64] FLAG: --kube-reserved=""
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616158 2577 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616161 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616164 2577 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616167 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616170 2577 flags.go:64] FLAG: --lock-file=""
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616172 2577 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616175 2577 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616178 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 19:20:35.619969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616184 2577 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616186 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616189 2577 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616192 2577 flags.go:64] FLAG: --logging-format="text"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616195 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616199 2577 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616201 2577 flags.go:64] FLAG: --manifest-url=""
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616204 2577 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616208 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616211 2577 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616215 2577 flags.go:64] FLAG: --max-pods="110"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616218 2577 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616221 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616224 2577 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616227 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616230 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616233 2577 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616236 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616244 2577 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616247 2577 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616252 2577 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616255 2577 flags.go:64] FLAG: --pod-cidr=""
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616257 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 19:20:35.620545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616263 2577 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616266 2577 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616269 2577 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616272 2577 flags.go:64] FLAG: --port="10250"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616275 2577 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616277 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05dafd2d968f89f15"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616280 2577 flags.go:64] FLAG: --qos-reserved=""
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616283 2577 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616286 2577 flags.go:64] FLAG: --register-node="true"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616289 2577 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616292 2577 flags.go:64] FLAG: --register-with-taints=""
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616296 2577 flags.go:64] FLAG: --registry-burst="10"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616299 2577 flags.go:64] FLAG: --registry-qps="5"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616302 2577 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616304 2577 flags.go:64] FLAG: --reserved-memory=""
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616308 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616311 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616314 2577 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616316 2577 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616319 2577 flags.go:64] FLAG: --runonce="false"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616322 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616325 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616328 2577 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616331 2577 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616334 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616337 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 19:20:35.621161 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616341 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616344 2577 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616347 2577 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616351 2577 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616354 2577 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616356 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616359 2577 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616362 2577 flags.go:64] FLAG: --system-cgroups=""
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616365 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616371 2577 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616374 2577 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616376 2577 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616381 2577 flags.go:64] FLAG: --tls-min-version=""
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616383 2577 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616386 2577 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616389 2577 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616392 2577 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616395 2577 flags.go:64] FLAG: --v="2"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616399 2577 flags.go:64] FLAG: --version="false"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616403 2577 flags.go:64] FLAG: --vmodule=""
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616407 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 19:20:35.621836 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.616410 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.616739 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:35.623963 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.616742 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:35.624449 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.616745 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:35.624449 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.616747 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:35.624449 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.616750 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:35.624449 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.616752 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:35.624449 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.617408 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:20:35.625918 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.625900 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 19:20:35.625965 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.625919 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625969 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625974 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625977 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625980 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625982 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625986 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625988 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625991 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625993 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.625996 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:35.625996 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626000 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:35.625996 ip-10-0-133-149 
kubenswrapper[2577]: W0420 19:20:35.626002 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626005 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626008 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626011 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626013 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626016 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626019 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626022 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626024 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626027 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626029 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626032 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626035 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626037 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626040 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626042 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626045 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626047 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626050 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:20:35.626301 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626055 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626057 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626060 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:35.626788 ip-10-0-133-149 
kubenswrapper[2577]: W0420 19:20:35.626063 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626065 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626068 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626070 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626073 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626076 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626078 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626080 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626083 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626086 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626088 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626092 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626096 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626099 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626102 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626104 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:20:35.626788 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626107 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626111 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626115 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626118 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626121 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626124 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626127 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626130 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626133 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626136 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626139 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626141 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626144 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626146 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626150 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626152 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626155 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626158 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626160 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:35.627241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626162 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626165 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626168 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626170 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626173 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:35.627868 ip-10-0-133-149 
kubenswrapper[2577]: W0420 19:20:35.626175 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626178 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626180 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626183 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626186 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626189 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626192 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626194 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626196 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626199 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626201 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626203 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:35.627868 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626206 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.626211 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626318 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626324 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626327 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626329 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626332 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626334 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:35.628296 
ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626337 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626340 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626342 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626345 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626347 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626350 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626352 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626355 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:35.628296 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626357 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626360 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626362 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626365 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626370 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626373 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626375 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626377 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626381 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626384 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626387 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626389 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626392 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626394 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626397 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626399 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626402 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626404 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626407 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:20:35.628690 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626410 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626412 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626415 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626417 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626420 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626422 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626425 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:20:35.629154 ip-10-0-133-149 
kubenswrapper[2577]: W0420 19:20:35.626428 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626430 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626432 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626435 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626437 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626440 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626442 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626445 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626447 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626450 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626452 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626455 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626457 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:35.629154 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626459 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626463 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626467 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626470 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626473 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626476 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626479 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626481 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626484 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626486 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626489 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626491 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626494 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626496 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626499 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626501 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626504 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626506 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626509 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626511 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:20:35.629636 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626514 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626516 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626519 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626521 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626524 2577 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626526 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626529 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626531 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626534 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626536 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626539 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626541 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:35.626543 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.626549 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:20:35.630133 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.627149 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 19:20:35.632076 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.632062 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 19:20:35.633151 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.633139 2577 server.go:1019] "Starting client certificate rotation" Apr 20 19:20:35.633264 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.633246 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 19:20:35.633621 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.633608 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 19:20:35.661581 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.661561 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 19:20:35.664303 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.664266 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 19:20:35.680511 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.680490 2577 log.go:25] "Validated CRI v1 runtime API" Apr 20 19:20:35.686659 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.686644 2577 log.go:25] "Validated CRI v1 image API" Apr 20 19:20:35.687993 ip-10-0-133-149 
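The feature gates: {map[...]} line lists the effective upstream Kubernetes gates; the long runs of "unrecognized feature gate" warnings seen earlier in this log are OpenShift-specific gate names that the embedded upstream gate parser does not register. A minimal sketch of that parser behavior using the upstream k8s.io/component-base/featuregate package; the gate names and defaults below are illustrative, and the idea that the wrapper downgrades the parser error to a W-level warning is inferred from these log lines rather than from source:

    package main

    import (
    	"fmt"

    	"k8s.io/component-base/featuregate"
    )

    func main() {
    	fg := featuregate.NewFeatureGate()
    	// Register only the gates this binary knows about (illustrative subset).
    	_ = fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
    		"NodeSwap": {Default: false, PreRelease: featuregate.Beta},
    	})
    	// Known gate: applied silently.
    	_ = fg.SetFromMap(map[string]bool{"NodeSwap": false})
    	// Unknown gate: SetFromMap returns the same "unrecognized feature gate"
    	// text that appears as a warning throughout this log.
    	if err := fg.SetFromMap(map[string]bool{"GatewayAPI": true}); err != nil {
    		fmt.Println(err) // unrecognized feature gate: GatewayAPI
    	}
    	fmt.Println("NodeSwap enabled:", fg.Enabled("NodeSwap"))
    }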
Apr 20 19:20:35.687993 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.687974 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:20:35.692217 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.692199 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ea5cc91b-026c-46ff-a214-5dc4b48e8da2:/dev/nvme0n1p3 eaa6a64c-d90b-472b-825c-3e8398b48aee:/dev/nvme0n1p4]
Apr 20 19:20:35.692282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.692216 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:20:35.697399 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.697382 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:20:35.697485 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.697347 2577 manager.go:217] Machine: {Timestamp:2026-04-20 19:20:35.695772375 +0000 UTC m=+0.410144454 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3238377 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a6c6305773bb8049e0815110e7294 SystemUUID:ec2a6c63-0577-3bb8-049e-0815110e7294 BootID:6f303def-c75d-48cf-89eb-6812a153e387 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d9:61:1e:9b:ed Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d9:61:1e:9b:ed Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:92:f4:49:5d:d0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:20:35.697485 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.697483 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 19:20:35.697575 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.697561 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:20:35.700226 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.700200 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:20:35.700359 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.700229 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-149.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 19:20:35.700400 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.700367 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 19:20:35.700400 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.700376 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 19:20:35.700400 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.700393 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:20:35.700473 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.700403 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:20:35.701547 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.701536 2577 state_mem.go:36] "Initialized new in-memory state store"
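The nodeConfig dump above fixes the node's reservations: SystemReserved of 500m CPU, 1Gi memory and 1Gi ephemeral-storage, plus hard eviction thresholds such as memory.available < 100Mi. Assuming the usual allocatable formula (capacity minus reservations minus the hard eviction threshold; the formula itself is not printed in this log), the 33164488704-byte MemoryCapacity from the Machine: line works out as below; a quick sketch with k8s.io/apimachinery/pkg/api/resource:

    package main

    import (
    	"fmt"

    	"k8s.io/apimachinery/pkg/api/resource"
    )

    func main() {
    	capacity := resource.MustParse("33164488704") // MemoryCapacity from the Machine: line
    	reserved := resource.MustParse("1Gi")         // SystemReserved memory
    	evictHard := resource.MustParse("100Mi")      // memory.available hard eviction threshold

    	allocatable := capacity.DeepCopy()
    	allocatable.Sub(reserved)
    	allocatable.Sub(evictHard)

    	// 33164488704 - 1Gi - 100Mi = 31985889280 bytes (~29.8Gi) advertised as allocatable
    	fmt.Println(allocatable.Value())
    }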
"Initialized new in-memory state store" Apr 20 19:20:35.701815 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.701805 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 19:20:35.704072 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.704062 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 20 19:20:35.704104 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.704098 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 19:20:35.704131 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.704110 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 19:20:35.704131 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.704120 2577 kubelet.go:397] "Adding apiserver pod source" Apr 20 19:20:35.704131 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.704128 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 19:20:35.705117 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.705103 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:20:35.705186 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.705122 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:20:35.708935 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.708920 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 19:20:35.710012 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.709999 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 19:20:35.711871 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711852 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 19:20:35.711871 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711872 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711878 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711884 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711892 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711901 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711918 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711924 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711931 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711937 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711950 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/projected" Apr 20 19:20:35.711968 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.711959 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 19:20:35.713478 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.713467 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 19:20:35.713532 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.713480 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 19:20:35.717017 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.717001 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 19:20:35.717110 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.717033 2577 server.go:1295] "Started kubelet" Apr 20 19:20:35.717110 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.717099 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-149.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 19:20:35.717224 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.717113 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-149.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 19:20:35.717224 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.717154 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 19:20:35.717224 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.717148 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 19:20:35.717224 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.717209 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 19:20:35.717497 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.717164 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 19:20:35.717850 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.717828 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5t788" Apr 20 19:20:35.717893 ip-10-0-133-149 systemd[1]: Started Kubernetes Kubelet. Apr 20 19:20:35.718101 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.718082 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 19:20:35.723714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.723675 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 20 19:20:35.725457 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.725434 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5t788" Apr 20 19:20:35.729423 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.729404 2577 kubelet.go:1618] "Image garbage collection failed once. 
Apr 20 19:20:35.730129 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.730104 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 19:20:35.730592 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.730575 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 19:20:35.731170 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731153 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 19:20:35.731170 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731144 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 19:20:35.731332 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731177 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 19:20:35.731332 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.731285 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found"
Apr 20 19:20:35.731332 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731320 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 19:20:35.731332 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731330 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 19:20:35.731460 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731336 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 19:20:35.731460 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731348 2577 factory.go:55] Registering systemd factory
Apr 20 19:20:35.731460 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731356 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 20 19:20:35.731596 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731571 2577 factory.go:153] Registering CRI-O factory
Apr 20 19:20:35.731640 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731600 2577 factory.go:223] Registration of the crio container factory successfully
Apr 20 19:20:35.731640 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731619 2577 factory.go:103] Registering Raw factory
Apr 20 19:20:35.731640 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731629 2577 manager.go:1196] Started watching for new ooms in manager
Apr 20 19:20:35.731979 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.731965 2577 manager.go:319] Starting recovery of all containers
Apr 20 19:20:35.739844 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.739708 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:35.742719 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.742701 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-149.ec2.internal\" not found" node="ip-10-0-133-149.ec2.internal"
Apr 20 19:20:35.743010 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.742999 2577 manager.go:324] Recovery completed
Apr 20 19:20:35.746817 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.746800 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:20:35.750712 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.750699 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:20:35.750789 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.750743 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:20:35.750789 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.750754 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:20:35.751210 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.751196 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 19:20:35.751250 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.751210 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 19:20:35.751250 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.751246 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:20:35.753271 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.753260 2577 policy_none.go:49] "None policy: Start"
Apr 20 19:20:35.753312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.753276 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 19:20:35.753312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.753286 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 19:20:35.786280 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.786260 2577 manager.go:341] "Starting Device Plugin manager"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.786303 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.786317 2577 server.go:85] "Starting device plugin registration server"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.786542 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.786554 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.786632 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.786704 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.786713 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.787231 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 19:20:35.801945 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.787268 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-149.ec2.internal\" not found"
Apr 20 19:20:35.828861 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.828841 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
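"Starting Device Plugin manager" above, together with the earlier "Creating device plugin registration server" version="v1beta1" line, is the kubelet's side of the device plugin protocol: plugins dial the kubelet's registration socket and announce their own endpoint and resource name. A minimal registration sketch against the published v1beta1 API; the plugin socket and resource name below are invented for illustration:

    package main

    import (
    	"context"
    	"net"
    	"time"

    	"google.golang.org/grpc"
    	"google.golang.org/grpc/credentials/insecure"
    	pluginapi "k8s.io/kubelet/pkg/apis/deviceplugin/v1beta1"
    )

    func main() {
    	// The registration socket is a unix socket, so supply a unix dialer.
    	dialer := func(ctx context.Context, addr string) (net.Conn, error) {
    		return (&net.Dialer{}).DialContext(ctx, "unix", addr)
    	}
    	conn, err := grpc.Dial(pluginapi.KubeletSocket, // /var/lib/kubelet/device-plugins/kubelet.sock
    		grpc.WithTransportCredentials(insecure.NewCredentials()),
    		grpc.WithContextDialer(dialer))
    	if err != nil {
    		panic(err)
    	}
    	defer conn.Close()

    	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    	defer cancel()

    	// Tell the kubelet where this plugin serves the DevicePlugin gRPC API.
    	_, err = pluginapi.NewRegistrationClient(conn).Register(ctx, &pluginapi.RegisterRequest{
    		Version:      pluginapi.Version,            // "v1beta1", as in the log line
    		Endpoint:     "example-device.sock",        // invented socket under /var/lib/kubelet/device-plugins/
    		ResourceName: "example.com/example-device", // invented resource name
    	})
    	if err != nil {
    		panic(err)
    	}
    }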
protocol="IPv4" Apr 20 19:20:35.830026 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.830007 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 19:20:35.830095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.830038 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 19:20:35.830095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.830072 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 19:20:35.830095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.830078 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 19:20:35.830190 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.830112 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 19:20:35.834608 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.834590 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:35.887168 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.887124 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:35.887949 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.887926 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:35.888029 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.887956 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:35.888029 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.887967 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:35.888029 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.887992 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-149.ec2.internal" Apr 20 19:20:35.898774 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.898757 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-149.ec2.internal" Apr 20 19:20:35.898820 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.898780 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-149.ec2.internal\": node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:35.912720 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.912701 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:35.930962 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.930943 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal"] Apr 20 19:20:35.931010 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.931000 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:35.931757 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.931744 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:35.931815 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.931768 2577 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:35.931815 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.931778 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:35.932903 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.932891 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:35.933056 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:35.933095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933071 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:35.933599 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933580 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:35.933668 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933605 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:35.933668 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933615 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:35.933668 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933582 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:35.933780 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933688 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:35.933780 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.933705 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:35.934823 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.934810 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" Apr 20 19:20:35.934868 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.934833 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:35.935654 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.935638 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:35.935712 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.935669 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:35.935712 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:35.935685 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:35.951247 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.951230 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-149.ec2.internal\" not found" node="ip-10-0-133-149.ec2.internal" Apr 20 19:20:35.954917 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:35.954901 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-149.ec2.internal\" not found" node="ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.013596 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.013575 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.032151 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.032128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cfb13cf22483fad0841f9bb06885f79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal\" (UID: \"7cfb13cf22483fad0841f9bb06885f79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.032226 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.032155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cfb13cf22483fad0841f9bb06885f79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal\" (UID: \"7cfb13cf22483fad0841f9bb06885f79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.114311 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.114290 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.132598 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.132580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cfb13cf22483fad0841f9bb06885f79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal\" (UID: \"7cfb13cf22483fad0841f9bb06885f79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.132664 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.132649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/7cfb13cf22483fad0841f9bb06885f79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal\" (UID: \"7cfb13cf22483fad0841f9bb06885f79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.132664 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.132656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cfb13cf22483fad0841f9bb06885f79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal\" (UID: \"7cfb13cf22483fad0841f9bb06885f79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.132769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.132673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cfb13cf22483fad0841f9bb06885f79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal\" (UID: \"7cfb13cf22483fad0841f9bb06885f79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.132769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.132689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/562c22904357368d150bcfb5b4deac02-config\") pod \"kube-apiserver-proxy-ip-10-0-133-149.ec2.internal\" (UID: \"562c22904357368d150bcfb5b4deac02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.215030 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.214983 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.233299 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.233271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/562c22904357368d150bcfb5b4deac02-config\") pod \"kube-apiserver-proxy-ip-10-0-133-149.ec2.internal\" (UID: \"562c22904357368d150bcfb5b4deac02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.233382 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.233329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/562c22904357368d150bcfb5b4deac02-config\") pod \"kube-apiserver-proxy-ip-10-0-133-149.ec2.internal\" (UID: \"562c22904357368d150bcfb5b4deac02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.253410 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.253395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.257798 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.257784 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" Apr 20 19:20:36.315408 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.315385 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.415851 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.415828 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.516293 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.516272 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.526560 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.526544 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:36.617044 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.617021 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.633343 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.633329 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 19:20:36.633442 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.633426 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:20:36.633491 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.633474 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:20:36.633523 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.633475 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:20:36.718014 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.717982 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.728160 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.728119 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:15:35 +0000 UTC" deadline="2027-12-09 12:00:22.069201456 +0000 UTC" Apr 20 19:20:36.728160 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.728156 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14344h39m45.341048131s" Apr 20 19:20:36.731069 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.731051 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 19:20:36.744620 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.744591 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 19:20:36.813568 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:36.813534 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfb13cf22483fad0841f9bb06885f79.slice/crio-b62aea444685ff1fe1c6be854314344bfa1c0e97903bce4785cc85b9e92cb0d4 WatchSource:0}: Error finding container b62aea444685ff1fe1c6be854314344bfa1c0e97903bce4785cc85b9e92cb0d4: Status 404 returned error can't find the container with id b62aea444685ff1fe1c6be854314344bfa1c0e97903bce4785cc85b9e92cb0d4 Apr 20 19:20:36.816484 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:36.816458 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562c22904357368d150bcfb5b4deac02.slice/crio-afe560e94713e0515463662639405d6f16b471faa516c6f3169c14fdee3a41a9 WatchSource:0}: Error finding container afe560e94713e0515463662639405d6f16b471faa516c6f3169c14fdee3a41a9: Status 404 returned error can't find the container with id afe560e94713e0515463662639405d6f16b471faa516c6f3169c14fdee3a41a9 Apr 20 19:20:36.818607 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.818587 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.820543 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.820530 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:20:36.833163 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.833119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" event={"ID":"7cfb13cf22483fad0841f9bb06885f79","Type":"ContainerStarted","Data":"b62aea444685ff1fe1c6be854314344bfa1c0e97903bce4785cc85b9e92cb0d4"} Apr 20 19:20:36.834060 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.834042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" event={"ID":"562c22904357368d150bcfb5b4deac02","Type":"ContainerStarted","Data":"afe560e94713e0515463662639405d6f16b471faa516c6f3169c14fdee3a41a9"} Apr 20 19:20:36.919368 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:36.919344 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-149.ec2.internal\" not found" Apr 20 19:20:36.924167 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.924152 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-d52vr" Apr 20 19:20:36.935085 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.935067 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-d52vr" Apr 20 19:20:36.990614 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:36.990593 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:37.031258 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.031238 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" Apr 20 19:20:37.050903 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.050880 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in 
Apr 20 19:20:37.051789 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.051747 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal"
Apr 20 19:20:37.067461 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.067444 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:20:37.685360 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.685307 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:37.704834 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.704800 2577 apiserver.go:52] "Watching apiserver"
Apr 20 19:20:37.715426 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.715405 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 19:20:37.715947 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.715913 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lnjzz","openshift-image-registry/node-ca-rnzz5","openshift-multus/multus-additional-cni-plugins-k75h7","openshift-multus/multus-fdm6h","openshift-multus/network-metrics-daemon-tssws","openshift-network-diagnostics/network-check-target-nnp2z","openshift-ovn-kubernetes/ovnkube-node-rhxmj","kube-system/konnectivity-agent-gzt79","kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz","openshift-cluster-node-tuning-operator/tuned-6774v","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal","openshift-network-operator/iptables-alerter-8pstc","kube-system/global-pull-secret-syncer-9r8xq"]
Apr 20 19:20:37.718668 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.718645 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj"
Apr 20 19:20:37.720872 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.720846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8pstc"
Apr 20 19:20:37.721461 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.721438 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 19:20:37.721531 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.721477 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 19:20:37.721531 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.721488 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 19:20:37.721654 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.721640 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 19:20:37.721714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.721698 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 19:20:37.722359 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.722226 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s26cl\""
Apr 20 19:20:37.722359 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.722313 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 19:20:37.722643 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.722624 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:20:37.723207 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.722977 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gzt79"
Apr 20 19:20:37.723486 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.723466 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 19:20:37.723584 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.723563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w4wnq\""
Apr 20 19:20:37.723692 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.723657 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 19:20:37.725090 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.724797 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 19:20:37.725090 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.724878 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 19:20:37.725090 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.725068 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d95rz\""
Apr 20 19:20:37.725391 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.725373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fdm6h"
Apr 20 19:20:37.727186 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.727163 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 19:20:37.727707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.727528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kmwq7\""
Apr 20 19:20:37.727707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.727544 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 19:20:37.727707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.727532 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 19:20:37.727707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.727688 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 19:20:37.727977 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.727904 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:37.728066 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.728039 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:37.730151 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.730134 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:37.730325 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.730305 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:37.732551 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.732535 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6774v"
Apr 20 19:20:37.734719 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.734525 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:20:37.736463 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.735300 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 19:20:37.736463 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.735559 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz"
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.736463 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.735826 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6gmb5\"" Apr 20 19:20:37.739322 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.739151 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.739869 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.739709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8372aa91-c5a0-4714-939b-8dc6743d0b72-konnectivity-ca\") pod \"konnectivity-agent-gzt79\" (UID: \"8372aa91-c5a0-4714-939b-8dc6743d0b72\") " pod="kube-system/konnectivity-agent-gzt79" Apr 20 19:20:37.739869 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.739789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-cni-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.739869 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.739823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-slash\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.740082 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.739857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-var-lib-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.740082 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.739926 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-cni-netd\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.740192 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-lib-modules\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.740192 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bljdg\" (UniqueName: \"kubernetes.io/projected/2556de9e-929f-44f6-9c30-d010ac805c34-kube-api-access-bljdg\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.740297 ip-10-0-133-149 kubenswrapper[2577]: I0420 
19:20:37.740236 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-cnibin\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.740297 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740276 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-daemon-config\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.740398 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740327 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvnml\" (UniqueName: \"kubernetes.io/projected/92c4c570-25df-4201-b0cf-3fc5e5d442d8-kube-api-access-zvnml\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.740398 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-kubelet\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysconfig\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-multus-certs\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6sv\" (UniqueName: \"kubernetes.io/projected/ec1d5da3-6144-4314-be21-f06f578325c6-kube-api-access-ss6sv\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-modprobe-d\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysctl-conf\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-netns\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-node-log\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-662kh\" (UniqueName: \"kubernetes.io/projected/5a91163e-e923-41e4-98ab-9b9dc9d412b6-kube-api-access-662kh\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-etc-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.740953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-ovn\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.741062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.741002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-host\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.741616 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.741410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysctl-d\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.741616 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.741486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2556de9e-929f-44f6-9c30-d010ac805c34-iptables-alerter-script\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " 
pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.741801 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.741766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-cni-multus\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.741886 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.741816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-systemd\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.741942 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.741881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-cni-bin\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.742209 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742144 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 19:20:37.742311 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.742553 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-hblxf\"" Apr 20 19:20:37.742643 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742573 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 19:20:37.742955 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742868 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 19:20:37.742955 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 19:20:37.742955 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-ovnkube-config\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.742955 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.742952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec1d5da3-6144-4314-be21-f06f578325c6-ovn-node-metrics-cert\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.743194 ip-10-0-133-149 kubenswrapper[2577]: 
I0420 19:20:37.743018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k47xq\" (UniqueName: \"kubernetes.io/projected/39c06111-8b7a-4d9f-a3de-f5c655ac387d-kube-api-access-k47xq\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:37.743194 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743054 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 19:20:37.743319 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743291 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 19:20:37.743401 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743322 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2556de9e-929f-44f6-9c30-d010ac805c34-host-slash\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.743401 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-run-netns\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.743542 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743407 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7lzxz\"" Apr 20 19:20:37.743542 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.743542 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-systemd\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.743542 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91163e-e923-41e4-98ab-9b9dc9d412b6-tmp\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.743804 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8372aa91-c5a0-4714-939b-8dc6743d0b72-agent-certs\") pod \"konnectivity-agent-gzt79\" (UID: \"8372aa91-c5a0-4714-939b-8dc6743d0b72\") " 
pod="kube-system/konnectivity-agent-gzt79" Apr 20 19:20:37.743804 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-os-release\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.743804 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743680 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-socket-dir-parent\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.743804 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-etc-kubernetes\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.744079 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.743884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-env-overrides\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.744386 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-system-cni-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.744386 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-conf-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.744533 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:20:37.744533 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-kubernetes\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.744533 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-cni-bin\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.744533 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-systemd-units\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.744743 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.744743 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-var-lib-kubelet\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.744743 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-hostroot\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.744743 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.744743 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-ovnkube-script-lib\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.744743 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:37.744743 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-sys\") pod \"tuned-6774v\" (UID: 
\"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92c4c570-25df-4201-b0cf-3fc5e5d442d8-cni-binary-copy\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-k8s-cni-cncf-io\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744791 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-kubelet\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-log-socket\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.744862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-run\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.745007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-tuned\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.745051 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.745069 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.745232 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.745514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.745390 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sqxxc\"" Apr 20 19:20:37.747292 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.747250 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 19:20:37.747417 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.747395 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wgwwp\"" Apr 20 19:20:37.747684 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.747522 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.747684 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.747578 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179" Apr 20 19:20:37.747897 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.747787 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 19:20:37.791663 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.791620 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:37.832162 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.832141 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 19:20:37.845998 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.845964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-socket-dir-parent\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846097 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-etc-kubernetes\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846097 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-env-overrides\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.846097 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-cni-binary-copy\") pod \"multus-additional-cni-plugins-k75h7\" 
(UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.846220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-system-cni-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-etc-kubernetes\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-conf-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-system-cni-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846367 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-socket-dir-parent\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846367 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:20:37.846367 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-kubernetes\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.846470 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-kubernetes\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.846470 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-conf-dir\") pod \"multus-fdm6h\" (UID: 
\"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846578 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-cni-bin\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-systemd-units\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.846714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.846714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-cni-bin\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.846714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-var-lib-kubelet\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.846714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846699 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-systemd-units\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.846714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-registration-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-sys-fs\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-var-lib-kubelet\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-hostroot\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-env-overrides\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-hostroot\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-ovnkube-script-lib\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.846926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847019 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rwkc\" (UniqueName: \"kubernetes.io/projected/0d71716e-8f11-403e-be5a-e4087524a0fc-kube-api-access-8rwkc\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.847043 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-sys\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92c4c570-25df-4201-b0cf-3fc5e5d442d8-cni-binary-copy\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.847072 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-k8s-cni-cncf-io\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-sys\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-kubelet\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.847158 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:20:38.347124859 +0000 UTC m=+3.061496945 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-kubelet\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-k8s-cni-cncf-io\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-log-socket\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-run\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-tuned\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-log-socket\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-run\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bd768ad4-6493-4653-aa46-ff5c53a0532e-hosts-file\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847393 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.847642 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-ovnkube-script-lib\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vh7\" (UniqueName: \"kubernetes.io/projected/94a9964b-f6a5-4b72-8989-1efbd67f430d-kube-api-access-b6vh7\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/17c6e5a1-3d98-4126-b48d-b3e384ab3179-dbus\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8372aa91-c5a0-4714-939b-8dc6743d0b72-konnectivity-ca\") pod \"konnectivity-agent-gzt79\" (UID: \"8372aa91-c5a0-4714-939b-8dc6743d0b72\") " pod="kube-system/konnectivity-agent-gzt79" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-cni-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847569 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-slash\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847644 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92c4c570-25df-4201-b0cf-3fc5e5d442d8-cni-binary-copy\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-cni-dir\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-slash\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-var-lib-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-cni-netd\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-var-lib-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-lib-modules\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-cni-netd\") pod 
\"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bljdg\" (UniqueName: \"kubernetes.io/projected/2556de9e-929f-44f6-9c30-d010ac805c34-kube-api-access-bljdg\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-cnibin\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-daemon-config\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.848498 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvnml\" (UniqueName: \"kubernetes.io/projected/92c4c570-25df-4201-b0cf-3fc5e5d442d8-kube-api-access-zvnml\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-kubelet\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-lib-modules\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-cnibin\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysconfig\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.847994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-kubelet\") pod \"ovnkube-node-rhxmj\" (UID: 
\"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysconfig\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-device-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bd768ad4-6493-4653-aa46-ff5c53a0532e-tmp-dir\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8372aa91-c5a0-4714-939b-8dc6743d0b72-konnectivity-ca\") pod \"konnectivity-agent-gzt79\" (UID: \"8372aa91-c5a0-4714-939b-8dc6743d0b72\") " pod="kube-system/konnectivity-agent-gzt79" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-multus-certs\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6sv\" (UniqueName: \"kubernetes.io/projected/ec1d5da3-6144-4314-be21-f06f578325c6-kube-api-access-ss6sv\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-modprobe-d\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysctl-conf\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8f8d204a-6287-475e-8bb2-4e2081ea3788-host\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-multus-certs\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-system-cni-dir\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.849366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-modprobe-d\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-os-release\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-netns\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysctl-conf\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.848950 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/92c4c570-25df-4201-b0cf-3fc5e5d442d8-multus-daemon-config\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-run-netns\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-node-log\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-662kh\" (UniqueName: \"kubernetes.io/projected/5a91163e-e923-41e4-98ab-9b9dc9d412b6-kube-api-access-662kh\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849445 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/17c6e5a1-3d98-4126-b48d-b3e384ab3179-kubelet-config\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-etc-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-ovn\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-host\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-socket-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-etc-selinux\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: 
\"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysctl-d\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.850188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849749 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vlx\" (UniqueName: \"kubernetes.io/projected/bd768ad4-6493-4653-aa46-ff5c53a0532e-kube-api-access-64vlx\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-etc-openvswitch\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-node-log\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-cnibin\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-ovn\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2556de9e-929f-44f6-9c30-d010ac805c34-iptables-alerter-script\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849902 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-cni-multus\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.849958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-sysctl-d\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-systemd\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-host\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-cni-bin\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-run-systemd\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-cni-bin\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-host-var-lib-cni-multus\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850525 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2556de9e-929f-44f6-9c30-d010ac805c34-iptables-alerter-script\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850930 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-tuned\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.850164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-ovnkube-config\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.852480 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec1d5da3-6144-4314-be21-f06f578325c6-ovn-node-metrics-cert\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k47xq\" (UniqueName: \"kubernetes.io/projected/39c06111-8b7a-4d9f-a3de-f5c655ac387d-kube-api-access-k47xq\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwj2h\" (UniqueName: \"kubernetes.io/projected/8f8d204a-6287-475e-8bb2-4e2081ea3788-kube-api-access-fwj2h\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.851353 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.851380 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.851394 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s97jp for pod openshift-network-diagnostics/network-check-target-nnp2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851420 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2556de9e-929f-44f6-9c30-d010ac805c34-host-slash\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2556de9e-929f-44f6-9c30-d010ac805c34-host-slash\") pod \"iptables-alerter-8pstc\" (UID: 
\"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-run-netns\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.851499 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp podName:bd765cc1-22af-43e0-a1bf-88a1ec201341 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:38.351474184 +0000 UTC m=+3.065846266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s97jp" (UniqueName: "kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp") pod "network-check-target-nnp2z" (UID: "bd765cc1-22af-43e0-a1bf-88a1ec201341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-run-netns\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-systemd\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91163e-e923-41e4-98ab-9b9dc9d412b6-tmp\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851640 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851670 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/8f8d204a-6287-475e-8bb2-4e2081ea3788-serviceca\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8372aa91-c5a0-4714-939b-8dc6743d0b72-agent-certs\") pod \"konnectivity-agent-gzt79\" (UID: \"8372aa91-c5a0-4714-939b-8dc6743d0b72\") " pod="kube-system/konnectivity-agent-gzt79" Apr 20 19:20:37.853334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-os-release\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.854142 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92c4c570-25df-4201-b0cf-3fc5e5d442d8-os-release\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.854142 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec1d5da3-6144-4314-be21-f06f578325c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.854142 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.851974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a91163e-e923-41e4-98ab-9b9dc9d412b6-etc-systemd\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.854142 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.852908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec1d5da3-6144-4314-be21-f06f578325c6-ovnkube-config\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.855287 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.855196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec1d5da3-6144-4314-be21-f06f578325c6-ovn-node-metrics-cert\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.855479 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.855371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91163e-e923-41e4-98ab-9b9dc9d412b6-tmp\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.855863 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.855839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/8372aa91-c5a0-4714-939b-8dc6743d0b72-agent-certs\") pod \"konnectivity-agent-gzt79\" (UID: \"8372aa91-c5a0-4714-939b-8dc6743d0b72\") " pod="kube-system/konnectivity-agent-gzt79" Apr 20 19:20:37.857910 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.857877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6sv\" (UniqueName: \"kubernetes.io/projected/ec1d5da3-6144-4314-be21-f06f578325c6-kube-api-access-ss6sv\") pod \"ovnkube-node-rhxmj\" (UID: \"ec1d5da3-6144-4314-be21-f06f578325c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:20:37.858586 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.858562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvnml\" (UniqueName: \"kubernetes.io/projected/92c4c570-25df-4201-b0cf-3fc5e5d442d8-kube-api-access-zvnml\") pod \"multus-fdm6h\" (UID: \"92c4c570-25df-4201-b0cf-3fc5e5d442d8\") " pod="openshift-multus/multus-fdm6h" Apr 20 19:20:37.859094 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.859076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-662kh\" (UniqueName: \"kubernetes.io/projected/5a91163e-e923-41e4-98ab-9b9dc9d412b6-kube-api-access-662kh\") pod \"tuned-6774v\" (UID: \"5a91163e-e923-41e4-98ab-9b9dc9d412b6\") " pod="openshift-cluster-node-tuning-operator/tuned-6774v" Apr 20 19:20:37.859454 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.859430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bljdg\" (UniqueName: \"kubernetes.io/projected/2556de9e-929f-44f6-9c30-d010ac805c34-kube-api-access-bljdg\") pod \"iptables-alerter-8pstc\" (UID: \"2556de9e-929f-44f6-9c30-d010ac805c34\") " pod="openshift-network-operator/iptables-alerter-8pstc" Apr 20 19:20:37.861377 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.861357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k47xq\" (UniqueName: \"kubernetes.io/projected/39c06111-8b7a-4d9f-a3de-f5c655ac387d-kube-api-access-k47xq\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:37.936580 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.936506 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:15:36 +0000 UTC" deadline="2028-01-21 20:12:15.056349626 +0000 UTC" Apr 20 19:20:37.936580 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.936533 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15384h51m37.119819323s" Apr 20 19:20:37.952818 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rwkc\" (UniqueName: \"kubernetes.io/projected/0d71716e-8f11-403e-be5a-e4087524a0fc-kube-api-access-8rwkc\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.952948 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bd768ad4-6493-4653-aa46-ff5c53a0532e-hosts-file\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " 
pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.952948 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.952948 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.952948 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6vh7\" (UniqueName: \"kubernetes.io/projected/94a9964b-f6a5-4b72-8989-1efbd67f430d-kube-api-access-b6vh7\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.952948 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/17c6e5a1-3d98-4126-b48d-b3e384ab3179-dbus\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.952948 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-device-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bd768ad4-6493-4653-aa46-ff5c53a0532e-tmp-dir\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f8d204a-6287-475e-8bb2-4e2081ea3788-host\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.952983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bd768ad4-6493-4653-aa46-ff5c53a0532e-hosts-file\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-system-cni-dir\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-device-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-system-cni-dir\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-os-release\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953106 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f8d204a-6287-475e-8bb2-4e2081ea3788-host\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/17c6e5a1-3d98-4126-b48d-b3e384ab3179-dbus\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-os-release\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 
19:20:37.953228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/17c6e5a1-3d98-4126-b48d-b3e384ab3179-kubelet-config\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/17c6e5a1-3d98-4126-b48d-b3e384ab3179-kubelet-config\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-socket-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-etc-selinux\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64vlx\" (UniqueName: \"kubernetes.io/projected/bd768ad4-6493-4653-aa46-ff5c53a0532e-kube-api-access-64vlx\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-cnibin\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bd768ad4-6493-4653-aa46-ff5c53a0532e-tmp-dir\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwj2h\" (UniqueName: 
\"kubernetes.io/projected/8f8d204a-6287-475e-8bb2-4e2081ea3788-kube-api-access-fwj2h\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-etc-selinux\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f8d204a-6287-475e-8bb2-4e2081ea3788-serviceca\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953437 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94a9964b-f6a5-4b72-8989-1efbd67f430d-cnibin\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953501 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-socket-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.953812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-cni-binary-copy\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.953812 
ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-registration-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.954366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-sys-fs\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.954366 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.953671 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:37.954366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7" Apr 20 19:20:37.954366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-registration-dir\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" Apr 20 19:20:37.954366 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:37.953750 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret podName:17c6e5a1-3d98-4126-b48d-b3e384ab3179 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:38.453717162 +0000 UTC m=+3.168089232 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret") pod "global-pull-secret-syncer-9r8xq" (UID: "17c6e5a1-3d98-4126-b48d-b3e384ab3179") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:37.954366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0d71716e-8f11-403e-be5a-e4087524a0fc-sys-fs\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz"
Apr 20 19:20:37.954366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.953865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f8d204a-6287-475e-8bb2-4e2081ea3788-serviceca\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5"
Apr 20 19:20:37.954683 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.954666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94a9964b-f6a5-4b72-8989-1efbd67f430d-cni-binary-copy\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7"
Apr 20 19:20:37.961655 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.961633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vh7\" (UniqueName: \"kubernetes.io/projected/94a9964b-f6a5-4b72-8989-1efbd67f430d-kube-api-access-b6vh7\") pod \"multus-additional-cni-plugins-k75h7\" (UID: \"94a9964b-f6a5-4b72-8989-1efbd67f430d\") " pod="openshift-multus/multus-additional-cni-plugins-k75h7"
Apr 20 19:20:37.961835 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.961811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rwkc\" (UniqueName: \"kubernetes.io/projected/0d71716e-8f11-403e-be5a-e4087524a0fc-kube-api-access-8rwkc\") pod \"aws-ebs-csi-driver-node-xktbz\" (UID: \"0d71716e-8f11-403e-be5a-e4087524a0fc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz"
Apr 20 19:20:37.962599 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.962582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vlx\" (UniqueName: \"kubernetes.io/projected/bd768ad4-6493-4653-aa46-ff5c53a0532e-kube-api-access-64vlx\") pod \"node-resolver-lnjzz\" (UID: \"bd768ad4-6493-4653-aa46-ff5c53a0532e\") " pod="openshift-dns/node-resolver-lnjzz"
Apr 20 19:20:37.962832 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:37.962814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwj2h\" (UniqueName: \"kubernetes.io/projected/8f8d204a-6287-475e-8bb2-4e2081ea3788-kube-api-access-fwj2h\") pod \"node-ca-rnzz5\" (UID: \"8f8d204a-6287-475e-8bb2-4e2081ea3788\") " pod="openshift-image-registry/node-ca-rnzz5"
Apr 20 19:20:38.033845 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.033822 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj"
Apr 20 19:20:38.043291 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.043269 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8pstc"
Apr 20 19:20:38.052827 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.052804 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gzt79"
Apr 20 19:20:38.059544 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.059526 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fdm6h"
Apr 20 19:20:38.067107 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.067090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6774v"
Apr 20 19:20:38.074643 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.074623 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz"
Apr 20 19:20:38.081162 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.081147 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rnzz5"
Apr 20 19:20:38.089686 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.089669 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lnjzz"
Apr 20 19:20:38.094254 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.094237 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k75h7"
Apr 20 19:20:38.356587 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.356559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:38.356741 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.356596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:38.356741 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.356738 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:38.356824 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.356770 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:38.356824 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.356786 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.356773459 +0000 UTC m=+4.071145524 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:38.356824 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.356793 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:38.356824 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.356806 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s97jp for pod openshift-network-diagnostics/network-check-target-nnp2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:38.356970 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.356857 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp podName:bd765cc1-22af-43e0-a1bf-88a1ec201341 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.356840896 +0000 UTC m=+4.071212983 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s97jp" (UniqueName: "kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp") pod "network-check-target-nnp2z" (UID: "bd765cc1-22af-43e0-a1bf-88a1ec201341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:38.457755 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.457708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:38.457918 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.457853 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:38.457990 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:38.457933 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret podName:17c6e5a1-3d98-4126-b48d-b3e384ab3179 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.457913198 +0000 UTC m=+4.172285267 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret") pod "global-pull-secret-syncer-9r8xq" (UID: "17c6e5a1-3d98-4126-b48d-b3e384ab3179") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:38.595795 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.595770 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d71716e_8f11_403e_be5a_e4087524a0fc.slice/crio-624a348c6cb8072228d50582ab16b8c913ac3b2aed839d1fbf6ff1aa1dfe6478 WatchSource:0}: Error finding container 624a348c6cb8072228d50582ab16b8c913ac3b2aed839d1fbf6ff1aa1dfe6478: Status 404 returned error can't find the container with id 624a348c6cb8072228d50582ab16b8c913ac3b2aed839d1fbf6ff1aa1dfe6478
Apr 20 19:20:38.597241 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.597218 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd768ad4_6493_4653_aa46_ff5c53a0532e.slice/crio-59c73a77c8a166502b745018b29e5381de86d72dfba687970a86f0c72f910037 WatchSource:0}: Error finding container 59c73a77c8a166502b745018b29e5381de86d72dfba687970a86f0c72f910037: Status 404 returned error can't find the container with id 59c73a77c8a166502b745018b29e5381de86d72dfba687970a86f0c72f910037
Apr 20 19:20:38.599029 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.599004 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2556de9e_929f_44f6_9c30_d010ac805c34.slice/crio-cc683f429c969055d3cfe709957b03d1a1ea2185171a6132743af1a4ba8d9cc7 WatchSource:0}: Error finding container cc683f429c969055d3cfe709957b03d1a1ea2185171a6132743af1a4ba8d9cc7: Status 404 returned error can't find the container with id cc683f429c969055d3cfe709957b03d1a1ea2185171a6132743af1a4ba8d9cc7
Apr 20 19:20:38.602088 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.602067 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec1d5da3_6144_4314_be21_f06f578325c6.slice/crio-22c1f75ec0d035f7edbded6baa0b39af6ee913a54c65dbb8680c8c19a3595813 WatchSource:0}: Error finding container 22c1f75ec0d035f7edbded6baa0b39af6ee913a54c65dbb8680c8c19a3595813: Status 404 returned error can't find the container with id 22c1f75ec0d035f7edbded6baa0b39af6ee913a54c65dbb8680c8c19a3595813
Apr 20 19:20:38.604530 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.604480 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a9964b_f6a5_4b72_8989_1efbd67f430d.slice/crio-90d2934614a53a6247968ee639510afdfec9d6827d6b8b03ea9a3516216bcd01 WatchSource:0}: Error finding container 90d2934614a53a6247968ee639510afdfec9d6827d6b8b03ea9a3516216bcd01: Status 404 returned error can't find the container with id 90d2934614a53a6247968ee639510afdfec9d6827d6b8b03ea9a3516216bcd01
Apr 20 19:20:38.605100 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.605073 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c4c570_25df_4201_b0cf_3fc5e5d442d8.slice/crio-97e66c8192b9abf0efe6ca6e079448418e39b653b338b4076fabfef2beb27d51 WatchSource:0}: Error finding container 97e66c8192b9abf0efe6ca6e079448418e39b653b338b4076fabfef2beb27d51: Status 404 returned error can't find the container with id 97e66c8192b9abf0efe6ca6e079448418e39b653b338b4076fabfef2beb27d51
Apr 20 19:20:38.606143 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.606122 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f8d204a_6287_475e_8bb2_4e2081ea3788.slice/crio-9c8fb56f1aa097589d6312679e9c7b2f5038a146e8bad971ae72da1a21ff4aa2 WatchSource:0}: Error finding container 9c8fb56f1aa097589d6312679e9c7b2f5038a146e8bad971ae72da1a21ff4aa2: Status 404 returned error can't find the container with id 9c8fb56f1aa097589d6312679e9c7b2f5038a146e8bad971ae72da1a21ff4aa2
Apr 20 19:20:38.607108 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.607060 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a91163e_e923_41e4_98ab_9b9dc9d412b6.slice/crio-5be79125fdf07ea1ec162b475056dd64ac18176685a00ba136903a1516372dc9 WatchSource:0}: Error finding container 5be79125fdf07ea1ec162b475056dd64ac18176685a00ba136903a1516372dc9: Status 404 returned error can't find the container with id 5be79125fdf07ea1ec162b475056dd64ac18176685a00ba136903a1516372dc9
Apr 20 19:20:38.608486 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:20:38.608447 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8372aa91_c5a0_4714_939b_8dc6743d0b72.slice/crio-0776d01e4e88fdf0aa1a2421d0b2cc41bf5d3f952978b4142ed559def250288e WatchSource:0}: Error finding container 0776d01e4e88fdf0aa1a2421d0b2cc41bf5d3f952978b4142ed559def250288e: Status 404 returned error can't find the container with id 0776d01e4e88fdf0aa1a2421d0b2cc41bf5d3f952978b4142ed559def250288e
Apr 20 19:20:38.745677 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.745527 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:38.836898 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.836860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6774v" event={"ID":"5a91163e-e923-41e4-98ab-9b9dc9d412b6","Type":"ContainerStarted","Data":"5be79125fdf07ea1ec162b475056dd64ac18176685a00ba136903a1516372dc9"}
Apr 20 19:20:38.837783 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.837754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerStarted","Data":"90d2934614a53a6247968ee639510afdfec9d6827d6b8b03ea9a3516216bcd01"}
Apr 20 19:20:38.839452 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.839424 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"22c1f75ec0d035f7edbded6baa0b39af6ee913a54c65dbb8680c8c19a3595813"}
Apr 20 19:20:38.841327 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.841301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lnjzz" event={"ID":"bd768ad4-6493-4653-aa46-ff5c53a0532e","Type":"ContainerStarted","Data":"59c73a77c8a166502b745018b29e5381de86d72dfba687970a86f0c72f910037"}
Apr 20 19:20:38.844215 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.844179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" event={"ID":"562c22904357368d150bcfb5b4deac02","Type":"ContainerStarted","Data":"ed7f4b87207fbef6ca369a66f9a0de4a7b02e6bbdbe7d98034dd21cc43a528ce"}
Apr 20 19:20:38.845943 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.845897 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rnzz5" event={"ID":"8f8d204a-6287-475e-8bb2-4e2081ea3788","Type":"ContainerStarted","Data":"9c8fb56f1aa097589d6312679e9c7b2f5038a146e8bad971ae72da1a21ff4aa2"}
Apr 20 19:20:38.846878 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.846858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fdm6h" event={"ID":"92c4c570-25df-4201-b0cf-3fc5e5d442d8","Type":"ContainerStarted","Data":"97e66c8192b9abf0efe6ca6e079448418e39b653b338b4076fabfef2beb27d51"}
Apr 20 19:20:38.848079 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.848057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8pstc" event={"ID":"2556de9e-929f-44f6-9c30-d010ac805c34","Type":"ContainerStarted","Data":"cc683f429c969055d3cfe709957b03d1a1ea2185171a6132743af1a4ba8d9cc7"}
Apr 20 19:20:38.849769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.849750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" event={"ID":"0d71716e-8f11-403e-be5a-e4087524a0fc","Type":"ContainerStarted","Data":"624a348c6cb8072228d50582ab16b8c913ac3b2aed839d1fbf6ff1aa1dfe6478"}
Apr 20 19:20:38.852840 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.852818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gzt79" event={"ID":"8372aa91-c5a0-4714-939b-8dc6743d0b72","Type":"ContainerStarted","Data":"0776d01e4e88fdf0aa1a2421d0b2cc41bf5d3f952978b4142ed559def250288e"}
Apr 20 19:20:38.937556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.937431 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:15:36 +0000 UTC" deadline="2027-11-20 22:47:28.191852891 +0000 UTC"
Apr 20 19:20:38.937556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:38.937472 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13899h26m49.25438434s"
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.364219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.364271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.364505 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.364526 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.364537 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s97jp for pod openshift-network-diagnostics/network-check-target-nnp2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.364595 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp podName:bd765cc1-22af-43e0-a1bf-88a1ec201341 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:41.364575043 +0000 UTC m=+6.078947132 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s97jp" (UniqueName: "kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp") pod "network-check-target-nnp2z" (UID: "bd765cc1-22af-43e0-a1bf-88a1ec201341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.364668 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:39.365126 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.364709 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:20:41.364696189 +0000 UTC m=+6.079068260 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:39.465532 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.465501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:39.465712 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.465694 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:39.465876 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.465797 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret podName:17c6e5a1-3d98-4126-b48d-b3e384ab3179 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:41.465778179 +0000 UTC m=+6.180150262 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret") pod "global-pull-secret-syncer-9r8xq" (UID: "17c6e5a1-3d98-4126-b48d-b3e384ab3179") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:39.831899 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.831870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:39.832326 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.831993 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:39.832326 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.832015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:39.832326 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.832117 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:39.832326 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.831878 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:39.832326 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:39.832218 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:39.866417 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.866362 2577 generic.go:358] "Generic (PLEG): container finished" podID="7cfb13cf22483fad0841f9bb06885f79" containerID="9ee921374dab80345a6f2b4c9569e955f8ceb70cb5d9c54a132366a183b2da22" exitCode=0
Apr 20 19:20:39.867280 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.867180 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" event={"ID":"7cfb13cf22483fad0841f9bb06885f79","Type":"ContainerDied","Data":"9ee921374dab80345a6f2b4c9569e955f8ceb70cb5d9c54a132366a183b2da22"}
Apr 20 19:20:39.882140 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:39.882078 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-149.ec2.internal" podStartSLOduration=2.882061239 podStartE2EDuration="2.882061239s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:20:38.860159009 +0000 UTC m=+3.574531094" watchObservedRunningTime="2026-04-20 19:20:39.882061239 +0000 UTC m=+4.596433328"
Apr 20 19:20:40.884989 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:40.884952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" event={"ID":"7cfb13cf22483fad0841f9bb06885f79","Type":"ContainerStarted","Data":"6467f99fde178ff976ee5ff7284befc2edf885ac816a7c5946b737e357bb0cff"}
Apr 20 19:20:41.381357 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:41.381310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:41.381547 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:41.381368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:41.381547 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.381524 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:41.381658 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.381583 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:20:45.381565161 +0000 UTC m=+10.095937230 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:41.381745 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.381666 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:41.381745 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.381679 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:41.381745 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.381692 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s97jp for pod openshift-network-diagnostics/network-check-target-nnp2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:41.381745 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.381742 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp podName:bd765cc1-22af-43e0-a1bf-88a1ec201341 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:45.381714996 +0000 UTC m=+10.096087066 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s97jp" (UniqueName: "kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp") pod "network-check-target-nnp2z" (UID: "bd765cc1-22af-43e0-a1bf-88a1ec201341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:41.482892 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:41.482346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:41.482892 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.482495 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:41.482892 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.482554 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret podName:17c6e5a1-3d98-4126-b48d-b3e384ab3179 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:45.482537763 +0000 UTC m=+10.196909834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret") pod "global-pull-secret-syncer-9r8xq" (UID: "17c6e5a1-3d98-4126-b48d-b3e384ab3179") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:41.832750 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:41.831194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:41.832750 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.831313 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:41.832750 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:41.831714 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:41.832750 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.831833 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:41.832750 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:41.831917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:41.832750 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:41.831988 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:43.831551 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:43.830669 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:43.831551 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:43.830815 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:43.831551 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:43.831375 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:43.831551 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:43.831490 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:43.832207 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:43.831670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:43.832207 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:43.831793 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:45.415486 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:45.415351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:45.415486 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:45.415412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:45.416003 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.415546 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:45.416003 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.415561 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:45.416003 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.415579 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:45.416003 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.415591 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s97jp for pod openshift-network-diagnostics/network-check-target-nnp2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:45.416003 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.415618 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:20:53.41559973 +0000 UTC m=+18.129971799 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:45.416003 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.415635 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp podName:bd765cc1-22af-43e0-a1bf-88a1ec201341 nodeName:}" failed.
No retries permitted until 2026-04-20 19:20:53.415626643 +0000 UTC m=+18.129998712 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s97jp" (UniqueName: "kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp") pod "network-check-target-nnp2z" (UID: "bd765cc1-22af-43e0-a1bf-88a1ec201341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:45.517030 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:45.516447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:45.517030 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.516621 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:45.517030 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.516683 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret podName:17c6e5a1-3d98-4126-b48d-b3e384ab3179 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:53.516664594 +0000 UTC m=+18.231036663 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret") pod "global-pull-secret-syncer-9r8xq" (UID: "17c6e5a1-3d98-4126-b48d-b3e384ab3179") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:45.832154 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:45.831484 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:45.832154 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:45.831605 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:20:45.832154 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.831685 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341" Apr 20 19:20:45.832154 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.831741 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d" Apr 20 19:20:45.832154 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:45.831779 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:45.832154 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:45.831843 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179" Apr 20 19:20:47.831710 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:47.831262 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:20:47.831710 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:47.831376 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341" Apr 20 19:20:47.831710 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:47.831429 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:47.831710 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:47.831540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d" Apr 20 19:20:47.831710 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:47.831561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:20:47.831710 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:47.831673 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179" Apr 20 19:20:49.831318 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:49.831284 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:20:49.831318 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:49.831303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:20:49.831905 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:49.831284 2577 util.go:30] "No sandbox for pod can be found. 
Apr 20 19:20:49.831905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:49.831409 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:49.831905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:49.831516 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:49.831905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:49.831646 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:51.830626 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:51.830591 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:51.831090 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:51.830591 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:51.831090 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:51.830736 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:51.831090 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:51.830598 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:51.831090 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:51.830833 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:51.831090 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:51.830903 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:53.472619 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:53.472576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:53.473074 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:53.472632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:53.473074 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.472757 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:53.473074 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.472778 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:53.473074 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.472803 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:53.473074 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.472816 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s97jp for pod openshift-network-diagnostics/network-check-target-nnp2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:53.473074 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.472827 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.47280722 +0000 UTC m=+34.187179291 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:53.473074 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.472869 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp podName:bd765cc1-22af-43e0-a1bf-88a1ec201341 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.472852427 +0000 UTC m=+34.187224498 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s97jp" (UniqueName: "kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp") pod "network-check-target-nnp2z" (UID: "bd765cc1-22af-43e0-a1bf-88a1ec201341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:53.573466 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:53.573435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:53.573618 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.573579 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:53.573674 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.573634 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret podName:17c6e5a1-3d98-4126-b48d-b3e384ab3179 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.573618131 +0000 UTC m=+34.287990210 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret") pod "global-pull-secret-syncer-9r8xq" (UID: "17c6e5a1-3d98-4126-b48d-b3e384ab3179") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:20:53.830305 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:53.830266 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:53.830483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:53.830279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:53.830483 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.830383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:53.830589 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.830486 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:53.830589 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:53.830279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:53.830589 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:53.830571 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:55.832217 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.832185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:55.832575 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:55.832329 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:55.833507 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.833467 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:55.834840 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.834817 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:55.834999 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:55.834980 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:55.835055 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:55.834882 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:55.913411 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.913195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gzt79" event={"ID":"8372aa91-c5a0-4714-939b-8dc6743d0b72","Type":"ContainerStarted","Data":"47f90954f1fe285f5f0bff66d036ce48106225796ec8541f184e0f8ee71e905c"}
Apr 20 19:20:55.916871 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.916844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6774v" event={"ID":"5a91163e-e923-41e4-98ab-9b9dc9d412b6","Type":"ContainerStarted","Data":"8ad7d52524dcc96a07d4571c66fade16ad7c2e6b4de0a5532135c5bec52a38a2"}
Apr 20 19:20:55.929588 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.929542 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-149.ec2.internal" podStartSLOduration=18.92952603 podStartE2EDuration="18.92952603s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:20:40.911931017 +0000 UTC m=+5.626303106" watchObservedRunningTime="2026-04-20 19:20:55.92952603 +0000 UTC m=+20.643898118"
Apr 20 19:20:55.948782 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.948553 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gzt79" podStartSLOduration=11.984660135 podStartE2EDuration="20.948536937s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.612036787 +0000 UTC m=+3.326408866" lastFinishedPulling="2026-04-20 19:20:47.575913584 +0000 UTC m=+12.290285668" observedRunningTime="2026-04-20 19:20:55.93004717 +0000 UTC m=+20.644419259" watchObservedRunningTime="2026-04-20 19:20:55.948536937 +0000 UTC m=+20.662909028"
Apr 20 19:20:55.948882 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:55.948789 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6774v" podStartSLOduration=3.884989624 podStartE2EDuration="20.948778441s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.610499911 +0000 UTC m=+3.324871982" lastFinishedPulling="2026-04-20 19:20:55.674288726 +0000 UTC m=+20.388660799" observedRunningTime="2026-04-20 19:20:55.947639903 +0000 UTC m=+20.662011993" watchObservedRunningTime="2026-04-20 19:20:55.948778441 +0000 UTC m=+20.663150530"
Apr 20 19:20:56.438902 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.438636 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gzt79"
Apr 20 19:20:56.439501 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.439439 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gzt79"
Apr 20 19:20:56.920022 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.919978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fdm6h" event={"ID":"92c4c570-25df-4201-b0cf-3fc5e5d442d8","Type":"ContainerStarted","Data":"f65d30d8ed1534d2cc3a7f7b2f833ee16b2f47aa70451bbc504681c6b9bc3a13"}
Apr 20 19:20:56.921296 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.921275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" event={"ID":"0d71716e-8f11-403e-be5a-e4087524a0fc","Type":"ContainerStarted","Data":"23c48aafa582052b3de391385431926109aad025db04e03f4559b7d272499f6b"}
Apr 20 19:20:56.922561 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.922538 2577 generic.go:358] "Generic (PLEG): container finished" podID="94a9964b-f6a5-4b72-8989-1efbd67f430d" containerID="f0342b13f49d22c8a661732cc1acc7aa9d4a4add956b167a41ed3f75151dbeac" exitCode=0
Apr 20 19:20:56.922644 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.922604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerDied","Data":"f0342b13f49d22c8a661732cc1acc7aa9d4a4add956b167a41ed3f75151dbeac"}
Apr 20 19:20:56.925010 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.924993 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:20:56.925317 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.925297 2577 generic.go:358] "Generic (PLEG): container finished" podID="ec1d5da3-6144-4314-be21-f06f578325c6" containerID="658898b1480f751c3b047208458ed7742db35259d12c2e492ff784de6f589ca8" exitCode=1
Apr 20 19:20:56.925394 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.925368 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"48551c5f464484ee5cb29fe217a96f3fad7575f301889fa11b1552a693377f0f"}
Apr 20 19:20:56.925394 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.925391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"c5886d9e9405ef55501da5edf45f0364358e789e76ced1f5db001fd70f529f57"}
Apr 20 19:20:56.925497 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.925400 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"d0a5656b2c632ba000427762bad35f635b091e740d05d6b8f923e33a264a735d"}
Apr 20 19:20:56.925497 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.925412 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"184c38a0fbf6357c04ecf9824f7d620c01607a4c36c33d4ac18a073c11791b83"}
Apr 20 19:20:56.925497 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.925426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerDied","Data":"658898b1480f751c3b047208458ed7742db35259d12c2e492ff784de6f589ca8"}
Apr 20 19:20:56.925497 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.925443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"783d952f87a220c1bd336247c47e4d0a3837f0f60d9f63116cae9a25cf150c26"}
Apr 20 19:20:56.926489 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.926473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lnjzz" event={"ID":"bd768ad4-6493-4653-aa46-ff5c53a0532e","Type":"ContainerStarted","Data":"eadfa7876f53b9c06ff2d61997a62f0464077d1bfa79a1f254484107ccde6650"}
Apr 20 19:20:56.927646 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.927620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rnzz5" event={"ID":"8f8d204a-6287-475e-8bb2-4e2081ea3788","Type":"ContainerStarted","Data":"de8bb5e38e7d287842c5f103987917a1f6c21989c46dfeb7ed29e009bea1ccee"}
Apr 20 19:20:56.927933 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.927919 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gzt79"
Apr 20 19:20:56.928312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.928296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gzt79"
Apr 20 19:20:56.976083 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.976042 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lnjzz" podStartSLOduration=3.903518558 podStartE2EDuration="20.976031604s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.599896895 +0000 UTC m=+3.314268965" lastFinishedPulling="2026-04-20 19:20:55.672409932 +0000 UTC m=+20.386782011" observedRunningTime="2026-04-20 19:20:56.975523756 +0000 UTC m=+21.689895843" watchObservedRunningTime="2026-04-20 19:20:56.976031604 +0000 UTC m=+21.690403692"
Apr 20 19:20:56.976256 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:56.976231 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fdm6h" podStartSLOduration=4.887885842 podStartE2EDuration="21.976224508s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.606935892 +0000 UTC m=+3.321307971" lastFinishedPulling="2026-04-20 19:20:55.695274571 +0000 UTC m=+20.409646637" observedRunningTime="2026-04-20 19:20:56.948588689 +0000 UTC m=+21.662960777" watchObservedRunningTime="2026-04-20 19:20:56.976224508 +0000 UTC m=+21.690596595"
Apr 20 19:20:57.047892 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.047842 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rnzz5" podStartSLOduration=4.984182013 podStartE2EDuration="22.047827494s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.608380624 +0000 UTC m=+3.322752697" lastFinishedPulling="2026-04-20 19:20:55.672026104 +0000 UTC m=+20.386398178" observedRunningTime="2026-04-20 19:20:57.016945414 +0000 UTC m=+21.731317503" watchObservedRunningTime="2026-04-20 19:20:57.047827494 +0000 UTC m=+21.762199598"
Apr 20 19:20:57.149835 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.149813 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 19:20:57.795821 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.795677 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:20:57.149829734Z","UUID":"6142b1c9-e9e7-4328-953f-6676afe8b15b","Handler":null,"Name":"","Endpoint":""}
Apr 20 19:20:57.797486 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.797460 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 19:20:57.797486 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.797491 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 19:20:57.831276 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.831246 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:57.831405 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:57.831369 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:57.831590 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.831567 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:57.831698 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:57.831675 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:57.831837 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.831819 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:57.831932 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:57.831912 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:20:57.931752 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.931700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8pstc" event={"ID":"2556de9e-929f-44f6-9c30-d010ac805c34","Type":"ContainerStarted","Data":"4df03985a8305efd84872698a52161512c98ae88e7e56522c859761c28bc53ac"}
Apr 20 19:20:57.934366 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:57.934337 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" event={"ID":"0d71716e-8f11-403e-be5a-e4087524a0fc","Type":"ContainerStarted","Data":"175bd66bd46fb413e91df10b0acbf4bb7d403299437c9f3453488c50b597ad93"}
Apr 20 19:20:58.939562 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:58.939295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" event={"ID":"0d71716e-8f11-403e-be5a-e4087524a0fc","Type":"ContainerStarted","Data":"f8696cdbca04983073ff5c3c336faafe9f23963176597a9064c39a9e8941e4a3"}
Apr 20 19:20:58.942904 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:58.942878 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:20:58.943208 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:58.943185 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"7ef23b3db30247e8b52cba8c980a295f4d13e59c6732f220fad8b0149f0e518f"}
Apr 20 19:20:58.965789 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:58.965749 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xktbz" podStartSLOduration=4.238393506 podStartE2EDuration="23.965719535s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.59775726 +0000 UTC m=+3.312129341" lastFinishedPulling="2026-04-20 19:20:58.325083291 +0000 UTC m=+23.039455370" observedRunningTime="2026-04-20 19:20:58.965665279 +0000 UTC m=+23.680037368" watchObservedRunningTime="2026-04-20 19:20:58.965719535 +0000 UTC m=+23.680091623"
Apr 20 19:20:58.966015 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:58.965991 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8pstc" podStartSLOduration=6.894466594 podStartE2EDuration="23.965980669s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.601012836 +0000 UTC m=+3.315384907" lastFinishedPulling="2026-04-20 19:20:55.672526905 +0000 UTC m=+20.386898982" observedRunningTime="2026-04-20 19:20:57.955918175 +0000 UTC m=+22.670290265" watchObservedRunningTime="2026-04-20 19:20:58.965980669 +0000 UTC m=+23.680352756"
Apr 20 19:20:59.830904 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:59.830876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:20:59.831110 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:59.831003 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:20:59.831110 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:20:59.831029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:20:59.831227 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:59.831000 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:20:59.831227 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:59.831125 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:20:59.831335 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:20:59.831221 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:21:00.951783 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:00.951692 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:21:00.953017 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:00.952110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"6089293fd861c1b00c63afeebfa50690ccb2077293c26ae7dc246722a8511eca"}
Apr 20 19:21:00.953017 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:00.952463 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj"
Apr 20 19:21:00.953017 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:00.952662 2577 scope.go:117] "RemoveContainer" containerID="658898b1480f751c3b047208458ed7742db35259d12c2e492ff784de6f589ca8"
Apr 20 19:21:00.972149 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:00.972096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj"
Apr 20 19:21:01.832563 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.832540 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:21:01.832716 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.832575 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:21:01.832716 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:01.832638 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:21:01.832716 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:01.832698 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:21:01.832847 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.832712 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:21:01.832847 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:01.832829 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:21:01.955124 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.955098 2577 generic.go:358] "Generic (PLEG): container finished" podID="94a9964b-f6a5-4b72-8989-1efbd67f430d" containerID="4acb918c2c83e31ef2ab715e0ca8c055c125a53b4e88a506e0bd1e0daf332cd0" exitCode=0
Apr 20 19:21:01.955538 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.955164 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerDied","Data":"4acb918c2c83e31ef2ab715e0ca8c055c125a53b4e88a506e0bd1e0daf332cd0"}
Apr 20 19:21:01.958534 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.958515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:21:01.958823 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.958797 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" event={"ID":"ec1d5da3-6144-4314-be21-f06f578325c6","Type":"ContainerStarted","Data":"7d45efeef4b077341efea6eac5cce6d900b4a1ffb9779eee19f85d56420dffe0"}
Apr 20 19:21:01.959308 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.959284 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 19:21:01.961957 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.960508 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj"
Apr 20 19:21:01.975895 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:01.975874 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj"
Apr 20 19:21:02.007072 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.007038 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" podStartSLOduration=9.883281056 podStartE2EDuration="27.007027953s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.603433533 +0000 UTC m=+3.317805598" lastFinishedPulling="2026-04-20 19:20:55.727180411 +0000 UTC m=+20.441552495" observedRunningTime="2026-04-20 19:21:02.00595979 +0000 UTC m=+26.720331879" watchObservedRunningTime="2026-04-20 19:21:02.007027953 +0000 UTC m=+26.721400042"
Apr 20 19:21:02.807381 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.807354 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nnp2z"]
Apr 20 19:21:02.807533 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.807468 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:21:02.807578 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:02.807551 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:21:02.811684 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.811646 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9r8xq"]
Apr 20 19:21:02.811819 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.811787 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:21:02.811915 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:02.811892 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:21:02.812262 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.812238 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tssws"]
Apr 20 19:21:02.812384 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.812323 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:21:02.812448 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:02.812405 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:21:02.960965 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:02.960900 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 19:21:03.946481 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:03.946312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj"
Apr 20 19:21:03.964501 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:03.964476 2577 generic.go:358] "Generic (PLEG): container finished" podID="94a9964b-f6a5-4b72-8989-1efbd67f430d" containerID="b5222ca79a571844485616c132454f3318386f960f85c1b039552ad1fd160c84" exitCode=0
Apr 20 19:21:03.964823 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:03.964555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerDied","Data":"b5222ca79a571844485616c132454f3318386f960f85c1b039552ad1fd160c84"}
Apr 20 19:21:04.831159 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:04.831118 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:21:04.831355 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:04.831225 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:21:04.831355 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:04.831222 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:21:04.831355 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:04.831238 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:21:04.831505 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:04.831348 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:21:04.831505 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:04.831408 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:21:05.970250 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:05.970169 2577 generic.go:358] "Generic (PLEG): container finished" podID="94a9964b-f6a5-4b72-8989-1efbd67f430d" containerID="7471ff65652fda81035ca58390a547f4da4e33b2ca3dca5e4cb757230893bfe7" exitCode=0
Apr 20 19:21:05.970250 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:05.970222 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerDied","Data":"7471ff65652fda81035ca58390a547f4da4e33b2ca3dca5e4cb757230893bfe7"}
Apr 20 19:21:06.830596 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:06.830564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:21:06.830596 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:06.830596 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:21:06.830831 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:06.830674 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:21:06.830831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:06.830743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:21:06.830905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:06.830823 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:21:06.830940 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:06.830899 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:21:08.831342 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:08.831260 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:21:08.831970 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:08.831260 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:21:08.831970 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:08.831378 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnp2z" podUID="bd765cc1-22af-43e0-a1bf-88a1ec201341"
Apr 20 19:21:08.831970 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:08.831475 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d"
Apr 20 19:21:08.831970 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:08.831275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:21:08.831970 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:08.831574 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9r8xq" podUID="17c6e5a1-3d98-4126-b48d-b3e384ab3179"
Apr 20 19:21:09.100405 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.100327 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-149.ec2.internal" event="NodeReady"
Apr 20 19:21:09.100563 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.100463 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 19:21:09.150854 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.150772 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7cqnx"]
Apr 20 19:21:09.164431 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.164406 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qksj4"]
Apr 20 19:21:09.164570 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.164549 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7cqnx"
Apr 20 19:21:09.169549 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.169496 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 19:21:09.169549 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.169518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 19:21:09.169549 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.169518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 19:21:09.169790 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.169594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ltv6t\""
Apr 20 19:21:09.187415 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.187390 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7cqnx"]
Apr 20 19:21:09.187415 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.187414 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qksj4"]
Apr 20 19:21:09.187555 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.187525 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.189673 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.189656 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 19:21:09.190154 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.190141 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 19:21:09.190220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.190202 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qtv7w\""
Apr 20 19:21:09.295381 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.295349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.295564 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.295398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68d9g\" (UniqueName: \"kubernetes.io/projected/4a717388-605c-4d9d-8381-4bbf7fe371fb-kube-api-access-68d9g\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx"
Apr 20 19:21:09.295564 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.295429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-config-volume\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.295564 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.295458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbxk\" (UniqueName: \"kubernetes.io/projected/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-kube-api-access-rzbxk\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.295564 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.295502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-tmp-dir\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.295756 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.295582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx"
Apr 20 19:21:09.396735 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.396638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.396735 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.396682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68d9g\" (UniqueName: \"kubernetes.io/projected/4a717388-605c-4d9d-8381-4bbf7fe371fb-kube-api-access-68d9g\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx"
Apr 20 19:21:09.396953 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.396792 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:21:09.396953 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.396856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-config-volume\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.396953 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.396875 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.896852447 +0000 UTC m=+34.611224513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:09.397102 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.396999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbxk\" (UniqueName: \"kubernetes.io/projected/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-kube-api-access-rzbxk\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.397102 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.397059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-tmp-dir\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.397200 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.397153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx"
Apr 20 19:21:09.397290 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.397270 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:21:09.397356 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.397346 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.897328919 +0000 UTC m=+34.611701005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found
Apr 20 19:21:09.397418 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.397397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-config-volume\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.397469 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.397449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-tmp-dir\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.408215 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.408188 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbxk\" (UniqueName: \"kubernetes.io/projected/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-kube-api-access-rzbxk\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.408389 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.408372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68d9g\" (UniqueName: \"kubernetes.io/projected/4a717388-605c-4d9d-8381-4bbf7fe371fb-kube-api-access-68d9g\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx"
Apr 20 19:21:09.497502 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.497471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z"
Apr 20 19:21:09.497662 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.497511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:21:09.497662 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.497631 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:21:09.497662 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.497646 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:21:09.497843 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.497672 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:21:09.497843 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.497678 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:21:41.497666367 +0000 UTC m=+66.212038433 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:21:09.497843 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.497685 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s97jp for pod openshift-network-diagnostics/network-check-target-nnp2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:21:09.497843 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.497751 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp podName:bd765cc1-22af-43e0-a1bf-88a1ec201341 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:41.497717253 +0000 UTC m=+66.212089337 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s97jp" (UniqueName: "kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp") pod "network-check-target-nnp2z" (UID: "bd765cc1-22af-43e0-a1bf-88a1ec201341") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:21:09.598615 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.598569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:21:09.598857 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.598739 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:21:09.598857 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.598802 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret podName:17c6e5a1-3d98-4126-b48d-b3e384ab3179 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:41.59878314 +0000 UTC m=+66.313155205 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret") pod "global-pull-secret-syncer-9r8xq" (UID: "17c6e5a1-3d98-4126-b48d-b3e384ab3179") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:21:09.900695 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.900659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx"
Apr 20 19:21:09.901460 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:09.900712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4"
Apr 20 19:21:09.901460 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.900839 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:21:09.901460 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.900858 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:21:09.901460 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.900903 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:21:10.900887605 +0000 UTC m=+35.615259691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found
Apr 20 19:21:09.901460 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:09.900918 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. No retries permitted until 2026-04-20 19:21:10.900911243 +0000 UTC m=+35.615283308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:10.831177 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.831150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq"
Apr 20 19:21:10.831435 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.831158 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:21:10.831435 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.831158 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:21:10.834635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.834448 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:21:10.834635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.834477 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:21:10.834635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.834513 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-67k9h\"" Apr 20 19:21:10.834635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.834522 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 19:21:10.834635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.834549 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:21:10.834635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.834483 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gxr6p\"" Apr 20 19:21:10.908490 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.908452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:21:10.908905 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:10.908510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:21:10.908905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:10.908588 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:10.908905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:10.908617 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:10.908905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:10.908658 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:21:12.908636251 +0000 UTC m=+37.623008322 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found Apr 20 19:21:10.908905 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:10.908678 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. 
No retries permitted until 2026-04-20 19:21:12.908668027 +0000 UTC m=+37.623040094 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found Apr 20 19:21:11.985305 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:11.985129 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerStarted","Data":"3cabcbcc0ecd46bf97bdb5c8fa5ea8516db21b547bb239bc33285492146d00fa"} Apr 20 19:21:12.922967 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:12.922936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:21:12.923109 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:12.922975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:21:12.923109 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:12.923074 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:12.923192 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:12.923132 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:21:16.92311779 +0000 UTC m=+41.637489859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found Apr 20 19:21:12.923192 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:12.923079 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:12.923259 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:12.923208 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. No retries permitted until 2026-04-20 19:21:16.92319763 +0000 UTC m=+41.637569696 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found Apr 20 19:21:12.988977 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:12.988945 2577 generic.go:358] "Generic (PLEG): container finished" podID="94a9964b-f6a5-4b72-8989-1efbd67f430d" containerID="3cabcbcc0ecd46bf97bdb5c8fa5ea8516db21b547bb239bc33285492146d00fa" exitCode=0 Apr 20 19:21:12.989305 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:12.989004 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerDied","Data":"3cabcbcc0ecd46bf97bdb5c8fa5ea8516db21b547bb239bc33285492146d00fa"} Apr 20 19:21:13.993575 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:13.993540 2577 generic.go:358] "Generic (PLEG): container finished" podID="94a9964b-f6a5-4b72-8989-1efbd67f430d" containerID="3efeb848560f70d10d074a9d8fb88ce597df2d91ad8665f8ac648c2b46ee7237" exitCode=0 Apr 20 19:21:13.993951 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:13.993613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerDied","Data":"3efeb848560f70d10d074a9d8fb88ce597df2d91ad8665f8ac648c2b46ee7237"} Apr 20 19:21:14.998292 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:14.998252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k75h7" event={"ID":"94a9964b-f6a5-4b72-8989-1efbd67f430d","Type":"ContainerStarted","Data":"164249bd616503ebac25b8fb0046f89529381bf53a6dd067ad3b2a28fe978a31"} Apr 20 19:21:15.023331 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:15.023262 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k75h7" podStartSLOduration=5.914567342 podStartE2EDuration="39.023250123s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:20:38.605994502 +0000 UTC m=+3.320366571" lastFinishedPulling="2026-04-20 19:21:11.714677284 +0000 UTC m=+36.429049352" observedRunningTime="2026-04-20 19:21:15.021188645 +0000 UTC m=+39.735560734" watchObservedRunningTime="2026-04-20 19:21:15.023250123 +0000 UTC m=+39.737622218" Apr 20 19:21:16.951867 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:16.951833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:21:16.951867 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:16.951876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:21:16.952363 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:16.951982 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:16.952363 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:16.951986 2577 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:16.952363 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:16.952032 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. No retries permitted until 2026-04-20 19:21:24.952019586 +0000 UTC m=+49.666391651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found Apr 20 19:21:16.952363 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:16.952047 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:21:24.952038918 +0000 UTC m=+49.666410983 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found Apr 20 19:21:19.918435 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:19.918401 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw"] Apr 20 19:21:19.922742 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:19.922706 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:19.925162 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:19.925135 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 19:21:19.925265 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:19.925204 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 19:21:19.926004 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:19.925986 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 19:21:19.926085 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:19.925993 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 19:21:19.931433 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:19.931415 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw"] Apr 20 19:21:20.071105 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.071080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5f8c988d-09d1-444c-874b-2239d18e6a4a-klusterlet-config\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.071251 ip-10-0-133-149 kubenswrapper[2577]: 
I0420 19:21:20.071154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f8c988d-09d1-444c-874b-2239d18e6a4a-tmp\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.071251 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.071179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8tbp\" (UniqueName: \"kubernetes.io/projected/5f8c988d-09d1-444c-874b-2239d18e6a4a-kube-api-access-n8tbp\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.171868 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.171802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5f8c988d-09d1-444c-874b-2239d18e6a4a-klusterlet-config\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.171868 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.171861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f8c988d-09d1-444c-874b-2239d18e6a4a-tmp\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.171996 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.171882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8tbp\" (UniqueName: \"kubernetes.io/projected/5f8c988d-09d1-444c-874b-2239d18e6a4a-kube-api-access-n8tbp\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.172240 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.172220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f8c988d-09d1-444c-874b-2239d18e6a4a-tmp\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.175106 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.175090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5f8c988d-09d1-444c-874b-2239d18e6a4a-klusterlet-config\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.180359 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.180329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8tbp\" (UniqueName: \"kubernetes.io/projected/5f8c988d-09d1-444c-874b-2239d18e6a4a-kube-api-access-n8tbp\") pod \"klusterlet-addon-workmgr-f4bb9977-vxjcw\" (UID: \"5f8c988d-09d1-444c-874b-2239d18e6a4a\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.232347 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.232316 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:20.357113 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:20.357083 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw"] Apr 20 19:21:20.360364 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:21:20.360337 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f8c988d_09d1_444c_874b_2239d18e6a4a.slice/crio-c8652588f00956c1525f27dbddaf40900fae941e6317132bc055f2bed889f340 WatchSource:0}: Error finding container c8652588f00956c1525f27dbddaf40900fae941e6317132bc055f2bed889f340: Status 404 returned error can't find the container with id c8652588f00956c1525f27dbddaf40900fae941e6317132bc055f2bed889f340 Apr 20 19:21:21.013998 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:21.013964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" event={"ID":"5f8c988d-09d1-444c-874b-2239d18e6a4a","Type":"ContainerStarted","Data":"c8652588f00956c1525f27dbddaf40900fae941e6317132bc055f2bed889f340"} Apr 20 19:21:25.008978 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:25.008929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:21:25.009404 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:25.009055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:21:25.009404 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:25.009104 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:25.009404 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:25.009152 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:25.009404 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:25.009186 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. No retries permitted until 2026-04-20 19:21:41.009165873 +0000 UTC m=+65.723537939 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found Apr 20 19:21:25.009404 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:25.009205 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:21:41.009196559 +0000 UTC m=+65.723568624 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found Apr 20 19:21:26.023532 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:26.023502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" event={"ID":"5f8c988d-09d1-444c-874b-2239d18e6a4a","Type":"ContainerStarted","Data":"b5fc44e878efbdb4435f499482741e2dacef1e769fc3042dbde848afec70015c"} Apr 20 19:21:26.023937 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:26.023708 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:26.025054 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:26.025029 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:21:26.039781 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:26.039656 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" podStartSLOduration=1.7507305199999998 podStartE2EDuration="7.039641414s" podCreationTimestamp="2026-04-20 19:21:19 +0000 UTC" firstStartedPulling="2026-04-20 19:21:20.362093775 +0000 UTC m=+45.076465845" lastFinishedPulling="2026-04-20 19:21:25.651004672 +0000 UTC m=+50.365376739" observedRunningTime="2026-04-20 19:21:26.039126553 +0000 UTC m=+50.753498640" watchObservedRunningTime="2026-04-20 19:21:26.039641414 +0000 UTC m=+50.754013503" Apr 20 19:21:34.981483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:34.981457 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rhxmj" Apr 20 19:21:41.018426 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.018398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:21:41.018891 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.018435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:21:41.018891 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:41.018563 2577 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:41.018891 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:41.018602 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:41.018891 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:41.018639 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:22:13.018617813 +0000 UTC m=+97.732989881 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found Apr 20 19:21:41.018891 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:41.018658 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. No retries permitted until 2026-04-20 19:22:13.018649237 +0000 UTC m=+97.733021303 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found Apr 20 19:21:41.521929 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.521899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:21:41.522091 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.521943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:21:41.524514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.524497 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:21:41.524570 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.524549 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:21:41.532312 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:41.532295 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:21:41.532387 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:21:41.532376 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:22:45.532352267 +0000 UTC m=+130.246724333 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : secret "metrics-daemon-secret" not found Apr 20 19:21:41.534402 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.534387 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:21:41.544845 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.544821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s97jp\" (UniqueName: \"kubernetes.io/projected/bd765cc1-22af-43e0-a1bf-88a1ec201341-kube-api-access-s97jp\") pod \"network-check-target-nnp2z\" (UID: \"bd765cc1-22af-43e0-a1bf-88a1ec201341\") " pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:21:41.623038 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.623010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:21:41.625460 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.625444 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 19:21:41.636034 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.636008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/17c6e5a1-3d98-4126-b48d-b3e384ab3179-original-pull-secret\") pod \"global-pull-secret-syncer-9r8xq\" (UID: \"17c6e5a1-3d98-4126-b48d-b3e384ab3179\") " pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:21:41.742982 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.742957 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9r8xq" Apr 20 19:21:41.757073 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.757054 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-67k9h\"" Apr 20 19:21:41.765753 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.765716 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:21:41.881860 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.881798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9r8xq"] Apr 20 19:21:41.886135 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:21:41.886108 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c6e5a1_3d98_4126_b48d_b3e384ab3179.slice/crio-8579c899bed5b0655286812acb7e7324038cba2695d4022bd45b819d2a440b93 WatchSource:0}: Error finding container 8579c899bed5b0655286812acb7e7324038cba2695d4022bd45b819d2a440b93: Status 404 returned error can't find the container with id 8579c899bed5b0655286812acb7e7324038cba2695d4022bd45b819d2a440b93 Apr 20 19:21:41.898823 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:41.898801 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nnp2z"] Apr 20 19:21:41.901448 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:21:41.901427 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd765cc1_22af_43e0_a1bf_88a1ec201341.slice/crio-7c2399775b29c6bc92ae5158c409f0b15320bf2e55782d75a9df14478c423bb8 WatchSource:0}: Error finding container 7c2399775b29c6bc92ae5158c409f0b15320bf2e55782d75a9df14478c423bb8: Status 404 returned error can't find the container with id 7c2399775b29c6bc92ae5158c409f0b15320bf2e55782d75a9df14478c423bb8 Apr 20 19:21:42.053994 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:42.053913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nnp2z" event={"ID":"bd765cc1-22af-43e0-a1bf-88a1ec201341","Type":"ContainerStarted","Data":"7c2399775b29c6bc92ae5158c409f0b15320bf2e55782d75a9df14478c423bb8"} Apr 20 19:21:42.054867 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:42.054841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9r8xq" event={"ID":"17c6e5a1-3d98-4126-b48d-b3e384ab3179","Type":"ContainerStarted","Data":"8579c899bed5b0655286812acb7e7324038cba2695d4022bd45b819d2a440b93"} Apr 20 19:21:47.066981 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:47.066949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nnp2z" event={"ID":"bd765cc1-22af-43e0-a1bf-88a1ec201341","Type":"ContainerStarted","Data":"fec144838c63f08144a528c5866597d48d65e60ac2451a09c08ecbd02e78dad4"} Apr 20 19:21:47.067474 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:47.067033 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:21:47.068203 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:47.068182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9r8xq" event={"ID":"17c6e5a1-3d98-4126-b48d-b3e384ab3179","Type":"ContainerStarted","Data":"b5f978a5abd763768e960d9bdbb6b5d6bb7b7b04d0350a8f8811b414d6200c96"} Apr 20 19:21:47.089040 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:47.089000 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nnp2z" podStartSLOduration=67.921368183 podStartE2EDuration="1m12.088991085s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" 
firstStartedPulling="2026-04-20 19:21:41.903231869 +0000 UTC m=+66.617603935" lastFinishedPulling="2026-04-20 19:21:46.070854771 +0000 UTC m=+70.785226837" observedRunningTime="2026-04-20 19:21:47.08823591 +0000 UTC m=+71.802607998" watchObservedRunningTime="2026-04-20 19:21:47.088991085 +0000 UTC m=+71.803363164" Apr 20 19:21:47.106889 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:21:47.106846 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9r8xq" podStartSLOduration=65.919981703 podStartE2EDuration="1m10.106832238s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="2026-04-20 19:21:41.88782774 +0000 UTC m=+66.602199806" lastFinishedPulling="2026-04-20 19:21:46.074678271 +0000 UTC m=+70.789050341" observedRunningTime="2026-04-20 19:21:47.106462668 +0000 UTC m=+71.820834755" watchObservedRunningTime="2026-04-20 19:21:47.106832238 +0000 UTC m=+71.821204329" Apr 20 19:22:13.029742 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:13.029677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:22:13.030153 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:13.029784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:22:13.030153 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:13.029827 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:22:13.030153 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:13.029863 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:22:13.030153 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:13.029895 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls podName:00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd nodeName:}" failed. No retries permitted until 2026-04-20 19:23:17.029878394 +0000 UTC m=+161.744250459 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls") pod "dns-default-qksj4" (UID: "00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd") : secret "dns-default-metrics-tls" not found Apr 20 19:22:13.030153 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:13.029910 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert podName:4a717388-605c-4d9d-8381-4bbf7fe371fb nodeName:}" failed. No retries permitted until 2026-04-20 19:23:17.029903776 +0000 UTC m=+161.744275842 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert") pod "ingress-canary-7cqnx" (UID: "4a717388-605c-4d9d-8381-4bbf7fe371fb") : secret "canary-serving-cert" not found Apr 20 19:22:18.072396 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:18.072369 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nnp2z" Apr 20 19:22:42.468505 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.468471 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk"] Apr 20 19:22:42.473068 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.473048 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.475245 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.475222 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 19:22:42.475357 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.475306 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 19:22:42.476199 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.476180 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-w9rpj\"" Apr 20 19:22:42.476251 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.476210 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 19:22:42.478329 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.478303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 19:22:42.481091 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.481070 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk"] Apr 20 19:22:42.588875 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.588853 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tjv47"] Apr 20 19:22:42.591755 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.591740 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.593944 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.593927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 19:22:42.594067 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.594049 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 19:22:42.594135 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.594056 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 19:22:42.594211 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.594196 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 19:22:42.594274 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.594196 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gpbmz\"" Apr 20 19:22:42.602163 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.602140 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 19:22:42.604950 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.604931 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tjv47"] Apr 20 19:22:42.630292 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.630269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nndbx\" (UniqueName: \"kubernetes.io/projected/d9241b80-47a6-4cf4-8485-01b585082093-kube-api-access-nndbx\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.630391 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.630315 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d9241b80-47a6-4cf4-8485-01b585082093-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.630438 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.630401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.731347 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-tmp\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.731434 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731366 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nndbx\" (UniqueName: \"kubernetes.io/projected/d9241b80-47a6-4cf4-8485-01b585082093-kube-api-access-nndbx\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.731434 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-snapshots\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.731434 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d9241b80-47a6-4cf4-8485-01b585082093-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.731588 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.731588 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.731666 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:42.731621 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:42.731701 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhpq\" (UniqueName: \"kubernetes.io/projected/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-kube-api-access-5xhpq\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.731701 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:42.731677 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls podName:d9241b80-47a6-4cf4-8485-01b585082093 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:43.231659801 +0000 UTC m=+127.946031871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-524jk" (UID: "d9241b80-47a6-4cf4-8485-01b585082093") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:42.731837 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-service-ca-bundle\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.731837 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.731764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-serving-cert\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.732064 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.732046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d9241b80-47a6-4cf4-8485-01b585082093-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.741441 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.741423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nndbx\" (UniqueName: \"kubernetes.io/projected/d9241b80-47a6-4cf4-8485-01b585082093-kube-api-access-nndbx\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:42.832254 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-snapshots\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832370 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832409 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhpq\" (UniqueName: \"kubernetes.io/projected/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-kube-api-access-5xhpq\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832409 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832396 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-service-ca-bundle\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832487 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-serving-cert\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832487 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-tmp\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832779 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-tmp\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832779 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-snapshots\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.832938 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.832926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-service-ca-bundle\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.833290 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.833266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.834551 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.834528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-serving-cert\") pod \"insights-operator-585dfdc468-tjv47\" (UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.840339 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.840315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhpq\" (UniqueName: \"kubernetes.io/projected/cf27dde3-1580-4f60-ad2f-abd6f261c5c1-kube-api-access-5xhpq\") pod \"insights-operator-585dfdc468-tjv47\" 
(UID: \"cf27dde3-1580-4f60-ad2f-abd6f261c5c1\") " pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:42.900389 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:42.900358 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-tjv47" Apr 20 19:22:43.011323 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:43.011268 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tjv47"] Apr 20 19:22:43.013983 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:22:43.013958 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf27dde3_1580_4f60_ad2f_abd6f261c5c1.slice/crio-f918975edc03e8fd10be24e472d07309b2fd3a7c9a48319a76220cd8f6bb04c4 WatchSource:0}: Error finding container f918975edc03e8fd10be24e472d07309b2fd3a7c9a48319a76220cd8f6bb04c4: Status 404 returned error can't find the container with id f918975edc03e8fd10be24e472d07309b2fd3a7c9a48319a76220cd8f6bb04c4 Apr 20 19:22:43.177102 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:43.177073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tjv47" event={"ID":"cf27dde3-1580-4f60-ad2f-abd6f261c5c1","Type":"ContainerStarted","Data":"f918975edc03e8fd10be24e472d07309b2fd3a7c9a48319a76220cd8f6bb04c4"} Apr 20 19:22:43.235519 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:43.235500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:43.235604 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:43.235591 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:43.235649 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:43.235641 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls podName:d9241b80-47a6-4cf4-8485-01b585082093 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:44.235627861 +0000 UTC m=+128.949999926 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-524jk" (UID: "d9241b80-47a6-4cf4-8485-01b585082093") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:44.242027 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:44.241994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:44.242450 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:44.242110 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:44.242450 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:44.242166 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls podName:d9241b80-47a6-4cf4-8485-01b585082093 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:46.242153058 +0000 UTC m=+130.956525123 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-524jk" (UID: "d9241b80-47a6-4cf4-8485-01b585082093") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:45.551694 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:45.551664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:22:45.552022 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:45.551794 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:22:45.552022 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:45.551846 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs podName:39c06111-8b7a-4d9f-a3de-f5c655ac387d nodeName:}" failed. No retries permitted until 2026-04-20 19:24:47.551831419 +0000 UTC m=+252.266203485 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs") pod "network-metrics-daemon-tssws" (UID: "39c06111-8b7a-4d9f-a3de-f5c655ac387d") : secret "metrics-daemon-secret" not found Apr 20 19:22:46.183191 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:46.183152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tjv47" event={"ID":"cf27dde3-1580-4f60-ad2f-abd6f261c5c1","Type":"ContainerStarted","Data":"48c4953bca52a5b1c364fce39e1cc9dbca8dfd2e1047ba077c2bf33227715943"} Apr 20 19:22:46.198084 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:46.198017 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-tjv47" podStartSLOduration=1.817785067 podStartE2EDuration="4.197999998s" podCreationTimestamp="2026-04-20 19:22:42 +0000 UTC" firstStartedPulling="2026-04-20 19:22:43.015614105 +0000 UTC m=+127.729986170" lastFinishedPulling="2026-04-20 19:22:45.395829028 +0000 UTC m=+130.110201101" observedRunningTime="2026-04-20 19:22:46.197600753 +0000 UTC m=+130.911972843" watchObservedRunningTime="2026-04-20 19:22:46.197999998 +0000 UTC m=+130.912372086" Apr 20 19:22:46.257476 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:46.257444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:46.257650 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:46.257627 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:46.257769 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:46.257714 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls podName:d9241b80-47a6-4cf4-8485-01b585082093 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:50.257693644 +0000 UTC m=+134.972065712 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-524jk" (UID: "d9241b80-47a6-4cf4-8485-01b585082093") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:48.986424 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:48.986398 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lnjzz_bd768ad4-6493-4653-aa46-ff5c53a0532e/dns-node-resolver/0.log" Apr 20 19:22:49.786151 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:49.786124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rnzz5_8f8d204a-6287-475e-8bb2-4e2081ea3788/node-ca/0.log" Apr 20 19:22:50.287821 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:50.287786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:50.288179 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:50.287926 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:50.288179 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:50.287993 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls podName:d9241b80-47a6-4cf4-8485-01b585082093 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:58.287974129 +0000 UTC m=+143.002346205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-524jk" (UID: "d9241b80-47a6-4cf4-8485-01b585082093") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:52.648666 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.648629 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c"] Apr 20 19:22:52.653210 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.653178 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:52.653708 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.653688 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt"] Apr 20 19:22:52.655600 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.655581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 19:22:52.655696 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.655583 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 19:22:52.656310 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.656289 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:22:52.656402 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.656287 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-7sx2h\"" Apr 20 19:22:52.656519 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.656503 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.658764 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.658743 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ttvx2\"" Apr 20 19:22:52.658957 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.658784 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 19:22:52.658957 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.658809 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 19:22:52.658957 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.658909 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c"] Apr 20 19:22:52.658957 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.658826 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 19:22:52.659191 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.659018 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:22:52.666797 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.666776 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt"] Apr 20 19:22:52.748591 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.748566 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc"] Apr 20 19:22:52.751316 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.751302 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:52.753620 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.753598 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 19:22:52.753801 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.753646 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 19:22:52.753801 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.753668 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7j2wc\"" Apr 20 19:22:52.753801 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.753670 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:22:52.754022 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.753892 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 19:22:52.758214 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.758196 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc"] Apr 20 19:22:52.803750 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.803713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-config\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.803834 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.803756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.803834 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.803774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpprf\" (UniqueName: \"kubernetes.io/projected/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-kube-api-access-fpprf\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.803834 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.803811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq522\" (UniqueName: \"kubernetes.io/projected/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-kube-api-access-gq522\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:52.803934 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.803873 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:52.904769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-config\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.904769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.904769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpprf\" (UniqueName: \"kubernetes.io/projected/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-kube-api-access-fpprf\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.904947 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:52.904947 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:52.904947 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-kube-api-access-wkt4g\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:52.904947 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq522\" (UniqueName: \"kubernetes.io/projected/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-kube-api-access-gq522\") 
pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:52.904947 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.904921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:52.905197 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:52.905048 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 19:22:52.905197 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:52.905128 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls podName:ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b nodeName:}" failed. No retries permitted until 2026-04-20 19:22:53.405107691 +0000 UTC m=+138.119479759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xq25c" (UID: "ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b") : secret "samples-operator-tls" not found Apr 20 19:22:52.905312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.905261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-config\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.906900 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.906880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.912814 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.912795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpprf\" (UniqueName: \"kubernetes.io/projected/1a448f9a-4a4e-4017-8ec1-117e5a1efe2d-kube-api-access-fpprf\") pod \"service-ca-operator-d6fc45fc5-2sdjt\" (UID: \"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:52.913221 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.913199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq522\" (UniqueName: \"kubernetes.io/projected/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-kube-api-access-gq522\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:52.970087 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:52.970065 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" Apr 20 19:22:53.006284 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.006254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:53.006428 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.006293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:53.006428 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.006317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-kube-api-access-wkt4g\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:53.006956 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.006928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:53.008291 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.008268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:53.015227 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.015202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkt4g\" (UniqueName: \"kubernetes.io/projected/c19d268b-1a81-44d2-9b22-adc4e7ec01d0-kube-api-access-wkt4g\") pod \"kube-storage-version-migrator-operator-6769c5d45-t96hc\" (UID: \"c19d268b-1a81-44d2-9b22-adc4e7ec01d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:53.060017 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.059989 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" Apr 20 19:22:53.078144 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.078116 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt"] Apr 20 19:22:53.080856 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:22:53.080826 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a448f9a_4a4e_4017_8ec1_117e5a1efe2d.slice/crio-09e7c0fe123d31abeeb7a9f7d610080e7cb257b3bce6e07c08f34989a40d30d9 WatchSource:0}: Error finding container 09e7c0fe123d31abeeb7a9f7d610080e7cb257b3bce6e07c08f34989a40d30d9: Status 404 returned error can't find the container with id 09e7c0fe123d31abeeb7a9f7d610080e7cb257b3bce6e07c08f34989a40d30d9 Apr 20 19:22:53.167771 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.167596 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc"] Apr 20 19:22:53.170026 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:22:53.169994 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19d268b_1a81_44d2_9b22_adc4e7ec01d0.slice/crio-fbe327680b97810ddce7c5051aab77ff04bc639d13f5a13dcc440a11582b382c WatchSource:0}: Error finding container fbe327680b97810ddce7c5051aab77ff04bc639d13f5a13dcc440a11582b382c: Status 404 returned error can't find the container with id fbe327680b97810ddce7c5051aab77ff04bc639d13f5a13dcc440a11582b382c Apr 20 19:22:53.197500 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.197475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" event={"ID":"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d","Type":"ContainerStarted","Data":"09e7c0fe123d31abeeb7a9f7d610080e7cb257b3bce6e07c08f34989a40d30d9"} Apr 20 19:22:53.198384 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.198363 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" event={"ID":"c19d268b-1a81-44d2-9b22-adc4e7ec01d0","Type":"ContainerStarted","Data":"fbe327680b97810ddce7c5051aab77ff04bc639d13f5a13dcc440a11582b382c"} Apr 20 19:22:53.409867 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:53.409839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:53.410003 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:53.409984 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 19:22:53.410054 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:53.410045 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls podName:ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b nodeName:}" failed. No retries permitted until 2026-04-20 19:22:54.41002987 +0000 UTC m=+139.124401937 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xq25c" (UID: "ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b") : secret "samples-operator-tls" not found Apr 20 19:22:54.418128 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:54.418086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:54.418539 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:54.418232 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 19:22:54.418539 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:54.418301 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls podName:ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b nodeName:}" failed. No retries permitted until 2026-04-20 19:22:56.418284954 +0000 UTC m=+141.132657024 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xq25c" (UID: "ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b") : secret "samples-operator-tls" not found Apr 20 19:22:56.206319 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.206280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" event={"ID":"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d","Type":"ContainerStarted","Data":"61986ea3059a5559304d262b52937e692822bdbe6303294a7b41ee1ed1ba47b6"} Apr 20 19:22:56.207584 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.207550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" event={"ID":"c19d268b-1a81-44d2-9b22-adc4e7ec01d0","Type":"ContainerStarted","Data":"11959146459fe01c43155614292cb5eb74d116f233129e310c0c4f142d24af82"} Apr 20 19:22:56.221080 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.221040 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" podStartSLOduration=1.8829747270000001 podStartE2EDuration="4.221029027s" podCreationTimestamp="2026-04-20 19:22:52 +0000 UTC" firstStartedPulling="2026-04-20 19:22:53.082699326 +0000 UTC m=+137.797071397" lastFinishedPulling="2026-04-20 19:22:55.420753628 +0000 UTC m=+140.135125697" observedRunningTime="2026-04-20 19:22:56.220450717 +0000 UTC m=+140.934822829" watchObservedRunningTime="2026-04-20 19:22:56.221029027 +0000 UTC m=+140.935401111" Apr 20 19:22:56.233760 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.233707 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" podStartSLOduration=1.982312605 podStartE2EDuration="4.233693441s" podCreationTimestamp="2026-04-20 19:22:52 +0000 UTC" firstStartedPulling="2026-04-20 19:22:53.171963726 
+0000 UTC m=+137.886335793" lastFinishedPulling="2026-04-20 19:22:55.423344563 +0000 UTC m=+140.137716629" observedRunningTime="2026-04-20 19:22:56.233536636 +0000 UTC m=+140.947908728" watchObservedRunningTime="2026-04-20 19:22:56.233693441 +0000 UTC m=+140.948065529" Apr 20 19:22:56.433870 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.433837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:22:56.434015 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:56.433959 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 19:22:56.434059 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:56.434053 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls podName:ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b nodeName:}" failed. No retries permitted until 2026-04-20 19:23:00.434037624 +0000 UTC m=+145.148409694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xq25c" (UID: "ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b") : secret "samples-operator-tls" not found Apr 20 19:22:56.718395 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.718356 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57"] Apr 20 19:22:56.721597 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.721575 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" Apr 20 19:22:56.723775 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.723753 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 19:22:56.724585 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.724571 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 19:22:56.724585 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.724577 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-vfb67\"" Apr 20 19:22:56.730377 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.730358 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57"] Apr 20 19:22:56.837608 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.837569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqbs9\" (UniqueName: \"kubernetes.io/projected/ebdf0ce7-38ad-46f6-a8af-493c326f2cfb-kube-api-access-nqbs9\") pod \"migrator-74bb7799d9-b5p57\" (UID: \"ebdf0ce7-38ad-46f6-a8af-493c326f2cfb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" Apr 20 19:22:56.938159 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.938127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqbs9\" (UniqueName: \"kubernetes.io/projected/ebdf0ce7-38ad-46f6-a8af-493c326f2cfb-kube-api-access-nqbs9\") pod \"migrator-74bb7799d9-b5p57\" (UID: \"ebdf0ce7-38ad-46f6-a8af-493c326f2cfb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" Apr 20 19:22:56.946287 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:56.946259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqbs9\" (UniqueName: \"kubernetes.io/projected/ebdf0ce7-38ad-46f6-a8af-493c326f2cfb-kube-api-access-nqbs9\") pod \"migrator-74bb7799d9-b5p57\" (UID: \"ebdf0ce7-38ad-46f6-a8af-493c326f2cfb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" Apr 20 19:22:57.030828 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:57.030801 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" Apr 20 19:22:57.142049 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:57.142022 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57"] Apr 20 19:22:57.145034 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:22:57.145008 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebdf0ce7_38ad_46f6_a8af_493c326f2cfb.slice/crio-99c7cda81a703bde1878bdbc06a805dea7bf63c63df755232ac4c682a85a130a WatchSource:0}: Error finding container 99c7cda81a703bde1878bdbc06a805dea7bf63c63df755232ac4c682a85a130a: Status 404 returned error can't find the container with id 99c7cda81a703bde1878bdbc06a805dea7bf63c63df755232ac4c682a85a130a Apr 20 19:22:57.214214 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:57.214180 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" event={"ID":"ebdf0ce7-38ad-46f6-a8af-493c326f2cfb","Type":"ContainerStarted","Data":"99c7cda81a703bde1878bdbc06a805dea7bf63c63df755232ac4c682a85a130a"} Apr 20 19:22:58.349465 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:58.349436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:22:58.349745 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:58.349568 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:58.349745 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:22:58.349626 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls podName:d9241b80-47a6-4cf4-8485-01b585082093 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:14.349610362 +0000 UTC m=+159.063982428 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-524jk" (UID: "d9241b80-47a6-4cf4-8485-01b585082093") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:22:59.223049 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:59.223013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" event={"ID":"ebdf0ce7-38ad-46f6-a8af-493c326f2cfb","Type":"ContainerStarted","Data":"5b3eb9deba43ed7e9af336cb345debe33fb497902f5f9ca739a77d6ddd9c027a"} Apr 20 19:22:59.223049 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:59.223050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" event={"ID":"ebdf0ce7-38ad-46f6-a8af-493c326f2cfb","Type":"ContainerStarted","Data":"056fb52421d749ac19994b0f2f6830326f84e39e177e7f344791e11178ad50ba"} Apr 20 19:22:59.238611 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:22:59.238559 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b5p57" podStartSLOduration=2.184864207 podStartE2EDuration="3.238544789s" podCreationTimestamp="2026-04-20 19:22:56 +0000 UTC" firstStartedPulling="2026-04-20 19:22:57.146804892 +0000 UTC m=+141.861176958" lastFinishedPulling="2026-04-20 19:22:58.200485455 +0000 UTC m=+142.914857540" observedRunningTime="2026-04-20 19:22:59.237969653 +0000 UTC m=+143.952341741" watchObservedRunningTime="2026-04-20 19:22:59.238544789 +0000 UTC m=+143.952916876" Apr 20 19:23:00.468288 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:00.468253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:23:00.468667 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:00.468364 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 19:23:00.468667 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:00.468420 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls podName:ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b nodeName:}" failed. No retries permitted until 2026-04-20 19:23:08.468406392 +0000 UTC m=+153.182778458 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xq25c" (UID: "ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b") : secret "samples-operator-tls" not found Apr 20 19:23:08.522957 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:08.522923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:23:08.525154 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:08.525132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xq25c\" (UID: \"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:23:08.564156 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:08.564134 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" Apr 20 19:23:08.676988 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:08.674742 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c"] Apr 20 19:23:09.248086 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:09.248055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" event={"ID":"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b","Type":"ContainerStarted","Data":"fcdbf27b4115469681a391de5f592267b29fe808a8467a9779a65d4e2e3bdd6c"} Apr 20 19:23:10.252274 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:10.252248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" event={"ID":"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b","Type":"ContainerStarted","Data":"c8cf08873bfd2b7f19688d588c6507344d969e55345ed7147023e7d0d1c7e5e5"} Apr 20 19:23:11.258081 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:11.258044 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" event={"ID":"ee2fcb53-7c47-46f6-9ba5-5a8c64e6514b","Type":"ContainerStarted","Data":"6928ed21bbd659d4ca29d58ee3ecb1bc1ba3f991c18fd317999085da84b67b88"} Apr 20 19:23:11.274877 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:11.274832 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xq25c" podStartSLOduration=17.827855121 podStartE2EDuration="19.274802244s" podCreationTimestamp="2026-04-20 19:22:52 +0000 UTC" firstStartedPulling="2026-04-20 19:23:08.714370629 +0000 UTC m=+153.428742695" lastFinishedPulling="2026-04-20 19:23:10.161317748 +0000 UTC m=+154.875689818" observedRunningTime="2026-04-20 19:23:11.273717105 +0000 UTC m=+155.988089190" watchObservedRunningTime="2026-04-20 19:23:11.274802244 +0000 UTC m=+155.989174375" Apr 20 19:23:12.174509 ip-10-0-133-149 kubenswrapper[2577]: E0420 
19:23:12.174473 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7cqnx" podUID="4a717388-605c-4d9d-8381-4bbf7fe371fb" Apr 20 19:23:12.196009 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:12.195977 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qksj4" podUID="00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd" Apr 20 19:23:12.260127 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:12.260101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:23:13.849452 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:13.849416 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tssws" podUID="39c06111-8b7a-4d9f-a3de-f5c655ac387d" Apr 20 19:23:14.367580 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:14.367545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:23:14.369825 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:14.369806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9241b80-47a6-4cf4-8485-01b585082093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-524jk\" (UID: \"d9241b80-47a6-4cf4-8485-01b585082093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:23:14.582248 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:14.582219 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" Apr 20 19:23:14.691844 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:14.691810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk"] Apr 20 19:23:14.696787 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:14.696759 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9241b80_47a6_4cf4_8485_01b585082093.slice/crio-26d86a6b8a2c72d00ec59d01649a1c63e22862a1f6b15132cd0e8d4b013ffbfe WatchSource:0}: Error finding container 26d86a6b8a2c72d00ec59d01649a1c63e22862a1f6b15132cd0e8d4b013ffbfe: Status 404 returned error can't find the container with id 26d86a6b8a2c72d00ec59d01649a1c63e22862a1f6b15132cd0e8d4b013ffbfe Apr 20 19:23:15.268645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:15.268604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" event={"ID":"d9241b80-47a6-4cf4-8485-01b585082093","Type":"ContainerStarted","Data":"26d86a6b8a2c72d00ec59d01649a1c63e22862a1f6b15132cd0e8d4b013ffbfe"} Apr 20 19:23:17.083074 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.083043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:23:17.083431 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.083098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:23:17.085404 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.085377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd-metrics-tls\") pod \"dns-default-qksj4\" (UID: \"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd\") " pod="openshift-dns/dns-default-qksj4" Apr 20 19:23:17.085497 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.085482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a717388-605c-4d9d-8381-4bbf7fe371fb-cert\") pod \"ingress-canary-7cqnx\" (UID: \"4a717388-605c-4d9d-8381-4bbf7fe371fb\") " pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:23:17.274922 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.274886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" event={"ID":"d9241b80-47a6-4cf4-8485-01b585082093","Type":"ContainerStarted","Data":"1d228935892d7bf28b7255ee2e4f1700405b9c649b2c44f2fea6cb35feb2d830"} Apr 20 19:23:17.291705 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.291655 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-524jk" podStartSLOduration=33.708619686 podStartE2EDuration="35.291639706s" podCreationTimestamp="2026-04-20 19:22:42 +0000 UTC" firstStartedPulling="2026-04-20 19:23:14.698525638 +0000 UTC m=+159.412897707" 
lastFinishedPulling="2026-04-20 19:23:16.281545644 +0000 UTC m=+160.995917727" observedRunningTime="2026-04-20 19:23:17.291238918 +0000 UTC m=+162.005611005" watchObservedRunningTime="2026-04-20 19:23:17.291639706 +0000 UTC m=+162.006011794" Apr 20 19:23:17.363121 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.363055 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ltv6t\"" Apr 20 19:23:17.370874 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.370856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7cqnx" Apr 20 19:23:17.507980 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:17.507950 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7cqnx"] Apr 20 19:23:17.510798 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:17.510774 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a717388_605c_4d9d_8381_4bbf7fe371fb.slice/crio-28556e1e9a3f8e5485b2202346f22177a87a9a960dc33b020b3e4c44ddcb1a56 WatchSource:0}: Error finding container 28556e1e9a3f8e5485b2202346f22177a87a9a960dc33b020b3e4c44ddcb1a56: Status 404 returned error can't find the container with id 28556e1e9a3f8e5485b2202346f22177a87a9a960dc33b020b3e4c44ddcb1a56 Apr 20 19:23:18.278378 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.278336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cqnx" event={"ID":"4a717388-605c-4d9d-8381-4bbf7fe371fb","Type":"ContainerStarted","Data":"28556e1e9a3f8e5485b2202346f22177a87a9a960dc33b020b3e4c44ddcb1a56"} Apr 20 19:23:18.736252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.734284 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8pzkw"] Apr 20 19:23:18.741375 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.741348 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.744037 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.744012 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 19:23:18.744963 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.744901 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 19:23:18.744963 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.744909 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-d2rzs\"" Apr 20 19:23:18.750450 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.749266 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8pzkw"] Apr 20 19:23:18.794819 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.794794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff1dc025-5bbe-4675-8b44-c791098ecbb6-data-volume\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.794952 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.794824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff1dc025-5bbe-4675-8b44-c791098ecbb6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.794952 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.794877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjjrm\" (UniqueName: \"kubernetes.io/projected/ff1dc025-5bbe-4675-8b44-c791098ecbb6-kube-api-access-jjjrm\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.795046 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.794956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff1dc025-5bbe-4675-8b44-c791098ecbb6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.795046 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.794996 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff1dc025-5bbe-4675-8b44-c791098ecbb6-crio-socket\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.818635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.818609 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85d95c7f6-4v9jj"] Apr 20 19:23:18.821761 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.821742 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.824775 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.824749 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mf28l\"" Apr 20 19:23:18.824947 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.824927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 19:23:18.825094 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.825074 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 19:23:18.826237 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.826212 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 19:23:18.832084 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.832063 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 19:23:18.844542 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.844518 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85d95c7f6-4v9jj"] Apr 20 19:23:18.897307 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-registry-tls\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.897471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjjrm\" (UniqueName: \"kubernetes.io/projected/ff1dc025-5bbe-4675-8b44-c791098ecbb6-kube-api-access-jjjrm\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.897471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-bound-sa-token\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.897471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff1dc025-5bbe-4675-8b44-c791098ecbb6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.897471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7eefee4f-85f2-4490-9054-b8484ab1a66f-trusted-ca\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " 
pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.897471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff1dc025-5bbe-4675-8b44-c791098ecbb6-crio-socket\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.897471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7eefee4f-85f2-4490-9054-b8484ab1a66f-registry-certificates\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.897471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7eefee4f-85f2-4490-9054-b8484ab1a66f-installation-pull-secrets\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.897854 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbg22\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-kube-api-access-gbg22\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.897854 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7eefee4f-85f2-4490-9054-b8484ab1a66f-image-registry-private-configuration\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.897854 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff1dc025-5bbe-4675-8b44-c791098ecbb6-data-volume\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.897854 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff1dc025-5bbe-4675-8b44-c791098ecbb6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.897854 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.897652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7eefee4f-85f2-4490-9054-b8484ab1a66f-ca-trust-extracted\") pod 
\"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.898420 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.898397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ff1dc025-5bbe-4675-8b44-c791098ecbb6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.898515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.898486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ff1dc025-5bbe-4675-8b44-c791098ecbb6-crio-socket\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.898716 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.898698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ff1dc025-5bbe-4675-8b44-c791098ecbb6-data-volume\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.900862 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.900836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ff1dc025-5bbe-4675-8b44-c791098ecbb6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.912545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.912526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjjrm\" (UniqueName: \"kubernetes.io/projected/ff1dc025-5bbe-4675-8b44-c791098ecbb6-kube-api-access-jjjrm\") pod \"insights-runtime-extractor-8pzkw\" (UID: \"ff1dc025-5bbe-4675-8b44-c791098ecbb6\") " pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:18.998928 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.998900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7eefee4f-85f2-4490-9054-b8484ab1a66f-ca-trust-extracted\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999058 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.998953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-registry-tls\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999058 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.998981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-bound-sa-token\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " 
pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999058 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.999040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7eefee4f-85f2-4490-9054-b8484ab1a66f-trusted-ca\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999294 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.999071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7eefee4f-85f2-4490-9054-b8484ab1a66f-registry-certificates\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999294 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.999095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7eefee4f-85f2-4490-9054-b8484ab1a66f-installation-pull-secrets\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999294 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.999126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbg22\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-kube-api-access-gbg22\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999294 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.999174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7eefee4f-85f2-4490-9054-b8484ab1a66f-image-registry-private-configuration\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:18.999516 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:18.999311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7eefee4f-85f2-4490-9054-b8484ab1a66f-ca-trust-extracted\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.000213 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.000141 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7eefee4f-85f2-4490-9054-b8484ab1a66f-registry-certificates\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.000492 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.000471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7eefee4f-85f2-4490-9054-b8484ab1a66f-trusted-ca\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 
19:23:19.001995 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.001973 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7eefee4f-85f2-4490-9054-b8484ab1a66f-installation-pull-secrets\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.002079 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.002059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-registry-tls\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.002247 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.002228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7eefee4f-85f2-4490-9054-b8484ab1a66f-image-registry-private-configuration\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.006976 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.006955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbg22\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-kube-api-access-gbg22\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.007282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.007267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7eefee4f-85f2-4490-9054-b8484ab1a66f-bound-sa-token\") pod \"image-registry-85d95c7f6-4v9jj\" (UID: \"7eefee4f-85f2-4490-9054-b8484ab1a66f\") " pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.053965 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.053931 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8pzkw" Apr 20 19:23:19.133164 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.133073 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:19.179342 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.179292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8pzkw"] Apr 20 19:23:19.184088 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:19.184024 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff1dc025_5bbe_4675_8b44_c791098ecbb6.slice/crio-0af0a51bb96edea62edc2fa237e79048a26c073bfc579755aa83e8c62b2f41de WatchSource:0}: Error finding container 0af0a51bb96edea62edc2fa237e79048a26c073bfc579755aa83e8c62b2f41de: Status 404 returned error can't find the container with id 0af0a51bb96edea62edc2fa237e79048a26c073bfc579755aa83e8c62b2f41de Apr 20 19:23:19.256535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.256474 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85d95c7f6-4v9jj"] Apr 20 19:23:19.259580 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:19.259549 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eefee4f_85f2_4490_9054_b8484ab1a66f.slice/crio-44aff7cac9c13960b679a2e5838ef0f32c1d0e4e2969df15e9234e4a4fd6e68c WatchSource:0}: Error finding container 44aff7cac9c13960b679a2e5838ef0f32c1d0e4e2969df15e9234e4a4fd6e68c: Status 404 returned error can't find the container with id 44aff7cac9c13960b679a2e5838ef0f32c1d0e4e2969df15e9234e4a4fd6e68c Apr 20 19:23:19.281975 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.281948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" event={"ID":"7eefee4f-85f2-4490-9054-b8484ab1a66f","Type":"ContainerStarted","Data":"44aff7cac9c13960b679a2e5838ef0f32c1d0e4e2969df15e9234e4a4fd6e68c"} Apr 20 19:23:19.283526 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.283504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cqnx" event={"ID":"4a717388-605c-4d9d-8381-4bbf7fe371fb","Type":"ContainerStarted","Data":"df5aeb9af395da1685b3fd2b0621680f200ce8d99a2616fec7074ac9f940e79c"} Apr 20 19:23:19.284626 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.284606 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pzkw" event={"ID":"ff1dc025-5bbe-4675-8b44-c791098ecbb6","Type":"ContainerStarted","Data":"7a9f0fe782734855d6588b3227f928ea07bfb401ca48684a396990c9912c01c7"} Apr 20 19:23:19.284704 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.284629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pzkw" event={"ID":"ff1dc025-5bbe-4675-8b44-c791098ecbb6","Type":"ContainerStarted","Data":"0af0a51bb96edea62edc2fa237e79048a26c073bfc579755aa83e8c62b2f41de"} Apr 20 19:23:19.302499 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:19.302462 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7cqnx" podStartSLOduration=128.87120283 podStartE2EDuration="2m10.302449985s" podCreationTimestamp="2026-04-20 19:21:09 +0000 UTC" firstStartedPulling="2026-04-20 19:23:17.512660516 +0000 UTC m=+162.227032582" lastFinishedPulling="2026-04-20 19:23:18.94390767 +0000 UTC m=+163.658279737" observedRunningTime="2026-04-20 19:23:19.300702972 +0000 UTC m=+164.015075059" 
watchObservedRunningTime="2026-04-20 19:23:19.302449985 +0000 UTC m=+164.016822073" Apr 20 19:23:20.289021 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:20.288990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" event={"ID":"7eefee4f-85f2-4490-9054-b8484ab1a66f","Type":"ContainerStarted","Data":"e5fb4eb0f0fe46dd7730f6c639215b44aabd571a6df420ef06be2ec6bfb7b905"} Apr 20 19:23:20.289508 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:20.289063 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" Apr 20 19:23:20.290580 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:20.290558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pzkw" event={"ID":"ff1dc025-5bbe-4675-8b44-c791098ecbb6","Type":"ContainerStarted","Data":"621f1593d84a6c0879bdc5bc2375c2d3fb1ec6145930aa9ec72aa1d9b8bd2532"} Apr 20 19:23:20.309644 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:20.309609 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj" podStartSLOduration=2.30959858 podStartE2EDuration="2.30959858s" podCreationTimestamp="2026-04-20 19:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:23:20.309162761 +0000 UTC m=+165.023534848" watchObservedRunningTime="2026-04-20 19:23:20.30959858 +0000 UTC m=+165.023970668" Apr 20 19:23:22.299970 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:22.299934 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8pzkw" event={"ID":"ff1dc025-5bbe-4675-8b44-c791098ecbb6","Type":"ContainerStarted","Data":"7b1fbaec5e2657685a7a199b007b6d188bae2df98d45cb04e363840bf5849623"} Apr 20 19:23:22.318015 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:22.317969 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8pzkw" podStartSLOduration=1.973394565 podStartE2EDuration="4.31795638s" podCreationTimestamp="2026-04-20 19:23:18 +0000 UTC" firstStartedPulling="2026-04-20 19:23:19.242401633 +0000 UTC m=+163.956773702" lastFinishedPulling="2026-04-20 19:23:21.586963448 +0000 UTC m=+166.301335517" observedRunningTime="2026-04-20 19:23:22.316417085 +0000 UTC m=+167.030789177" watchObservedRunningTime="2026-04-20 19:23:22.31795638 +0000 UTC m=+167.032328467" Apr 20 19:23:23.830327 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:23.830293 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qksj4" Apr 20 19:23:23.832993 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:23.832971 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qtv7w\"" Apr 20 19:23:23.841606 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:23.841589 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qksj4" Apr 20 19:23:23.953646 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:23.953583 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qksj4"] Apr 20 19:23:23.956007 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:23.955978 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ad0cd4_ddc3_4fb0_8b71_18eaf617c1cd.slice/crio-4c00713241049d239296ba0c66060936a529513d735ae8d17edeef7a1b852dc5 WatchSource:0}: Error finding container 4c00713241049d239296ba0c66060936a529513d735ae8d17edeef7a1b852dc5: Status 404 returned error can't find the container with id 4c00713241049d239296ba0c66060936a529513d735ae8d17edeef7a1b852dc5 Apr 20 19:23:24.306187 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:24.306157 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qksj4" event={"ID":"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd","Type":"ContainerStarted","Data":"4c00713241049d239296ba0c66060936a529513d735ae8d17edeef7a1b852dc5"} Apr 20 19:23:25.309356 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:25.309323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qksj4" event={"ID":"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd","Type":"ContainerStarted","Data":"cefc71667645e14fc607db10dc3b8d45205ff8f77d9b2c3d3f1d5bd74d194939"} Apr 20 19:23:26.024486 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:26.024434 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" podUID="5f8c988d-09d1-444c-874b-2239d18e6a4a" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused" Apr 20 19:23:26.313543 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:26.313515 2577 generic.go:358] "Generic (PLEG): container finished" podID="5f8c988d-09d1-444c-874b-2239d18e6a4a" containerID="b5fc44e878efbdb4435f499482741e2dacef1e769fc3042dbde848afec70015c" exitCode=1 Apr 20 19:23:26.313935 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:26.313590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" event={"ID":"5f8c988d-09d1-444c-874b-2239d18e6a4a","Type":"ContainerDied","Data":"b5fc44e878efbdb4435f499482741e2dacef1e769fc3042dbde848afec70015c"} Apr 20 19:23:26.313990 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:26.313946 2577 scope.go:117] "RemoveContainer" containerID="b5fc44e878efbdb4435f499482741e2dacef1e769fc3042dbde848afec70015c" Apr 20 19:23:26.315068 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:26.315044 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qksj4" event={"ID":"00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd","Type":"ContainerStarted","Data":"62a0ed8d8fc8ea5721dd5557888f13c21503101543933459d8b8be5e02763ac5"} Apr 20 19:23:26.315227 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:26.315210 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qksj4" Apr 20 19:23:26.344935 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:26.344897 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qksj4" podStartSLOduration=136.178972258 podStartE2EDuration="2m17.344880993s" podCreationTimestamp="2026-04-20 19:21:09 +0000 UTC" 
firstStartedPulling="2026-04-20 19:23:23.960317931 +0000 UTC m=+168.674690009" lastFinishedPulling="2026-04-20 19:23:25.126226679 +0000 UTC m=+169.840598744" observedRunningTime="2026-04-20 19:23:26.344688026 +0000 UTC m=+171.059060105" watchObservedRunningTime="2026-04-20 19:23:26.344880993 +0000 UTC m=+171.059253083" Apr 20 19:23:27.202923 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.202891 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vxd2g"] Apr 20 19:23:27.205927 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.205911 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.208281 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.208262 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kfq2q\"" Apr 20 19:23:27.208493 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.208479 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 19:23:27.208666 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.208654 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 19:23:27.209162 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.209145 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 19:23:27.209219 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.209191 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 19:23:27.257201 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-root\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257299 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-accelerators-collector-config\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257299 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-sys\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257299 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a07410c-4c3a-40a9-955d-2fc040fffc3a-metrics-client-ca\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257415 ip-10-0-133-149 
kubenswrapper[2577]: I0420 19:23:27.257332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-wtmp\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257415 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257370 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nk9\" (UniqueName: \"kubernetes.io/projected/4a07410c-4c3a-40a9-955d-2fc040fffc3a-kube-api-access-x4nk9\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-tls\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.257515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.257503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-textfile\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.319245 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.319220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" event={"ID":"5f8c988d-09d1-444c-874b-2239d18e6a4a","Type":"ContainerStarted","Data":"70d6f3a99bce67649110947f672970078d1c59c8d99f2075468c8db0d14288da"} Apr 20 19:23:27.319593 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.319580 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:23:27.320134 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.320118 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4bb9977-vxjcw" Apr 20 19:23:27.358084 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-accelerators-collector-config\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358201 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358106 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-sys\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358201 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a07410c-4c3a-40a9-955d-2fc040fffc3a-metrics-client-ca\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-wtmp\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358365 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4nk9\" (UniqueName: \"kubernetes.io/projected/4a07410c-4c3a-40a9-955d-2fc040fffc3a-kube-api-access-x4nk9\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358422 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-sys\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358422 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358525 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-tls\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358525 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:27.358513 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 19:23:27.358624 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-wtmp\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358624 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-textfile\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358624 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:27.358596 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-tls podName:4a07410c-4c3a-40a9-955d-2fc040fffc3a nodeName:}" failed. No retries permitted until 2026-04-20 19:23:27.858576991 +0000 UTC m=+172.572949058 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-tls") pod "node-exporter-vxd2g" (UID: "4a07410c-4c3a-40a9-955d-2fc040fffc3a") : secret "node-exporter-tls" not found Apr 20 19:23:27.358812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-root\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a07410c-4c3a-40a9-955d-2fc040fffc3a-root\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358812 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-accelerators-collector-config\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.358917 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.358819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-textfile\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.359077 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.359060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a07410c-4c3a-40a9-955d-2fc040fffc3a-metrics-client-ca\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.360648 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.360631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.368150 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.368128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nk9\" (UniqueName: 
\"kubernetes.io/projected/4a07410c-4c3a-40a9-955d-2fc040fffc3a-kube-api-access-x4nk9\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.862348 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.862317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-tls\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:27.864457 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:27.864440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a07410c-4c3a-40a9-955d-2fc040fffc3a-node-exporter-tls\") pod \"node-exporter-vxd2g\" (UID: \"4a07410c-4c3a-40a9-955d-2fc040fffc3a\") " pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:28.114748 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:28.114662 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vxd2g" Apr 20 19:23:28.122761 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:28.122736 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a07410c_4c3a_40a9_955d_2fc040fffc3a.slice/crio-d9dc18bd5db1d3dd7e966d124537a854392575212f6e31f0906fbf8f1b73a5f8 WatchSource:0}: Error finding container d9dc18bd5db1d3dd7e966d124537a854392575212f6e31f0906fbf8f1b73a5f8: Status 404 returned error can't find the container with id d9dc18bd5db1d3dd7e966d124537a854392575212f6e31f0906fbf8f1b73a5f8 Apr 20 19:23:28.325882 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:28.325851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vxd2g" event={"ID":"4a07410c-4c3a-40a9-955d-2fc040fffc3a","Type":"ContainerStarted","Data":"d9dc18bd5db1d3dd7e966d124537a854392575212f6e31f0906fbf8f1b73a5f8"} Apr 20 19:23:28.831012 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:28.830976 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws" Apr 20 19:23:29.329429 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:29.329398 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a07410c-4c3a-40a9-955d-2fc040fffc3a" containerID="aff3547179d5ca4869e7bd5f56c5e735991a12c7d308adedea7308ccdf682c05" exitCode=0 Apr 20 19:23:29.329839 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:29.329483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vxd2g" event={"ID":"4a07410c-4c3a-40a9-955d-2fc040fffc3a","Type":"ContainerDied","Data":"aff3547179d5ca4869e7bd5f56c5e735991a12c7d308adedea7308ccdf682c05"} Apr 20 19:23:30.334164 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:30.334126 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vxd2g" event={"ID":"4a07410c-4c3a-40a9-955d-2fc040fffc3a","Type":"ContainerStarted","Data":"79dc7ba359dfe18b9e63eeffd45e1f33f8cad3d5d2b2e2badcd2b13ead26119c"} Apr 20 19:23:30.334521 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:30.334170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vxd2g" event={"ID":"4a07410c-4c3a-40a9-955d-2fc040fffc3a","Type":"ContainerStarted","Data":"52c0d07356136c1f038905f01eece8406a2a7dc292d96f06e3d37da7b4d4a72c"} Apr 20 19:23:30.353112 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:30.353073 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vxd2g" podStartSLOduration=2.574872919 podStartE2EDuration="3.353060122s" podCreationTimestamp="2026-04-20 19:23:27 +0000 UTC" firstStartedPulling="2026-04-20 19:23:28.124830611 +0000 UTC m=+172.839202680" lastFinishedPulling="2026-04-20 19:23:28.903017806 +0000 UTC m=+173.617389883" observedRunningTime="2026-04-20 19:23:30.353008447 +0000 UTC m=+175.067380526" watchObservedRunningTime="2026-04-20 19:23:30.353060122 +0000 UTC m=+175.067432211" Apr 20 19:23:31.943159 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:31.943127 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf"] Apr 20 19:23:31.948081 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:31.948063 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" Apr 20 19:23:31.950419 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:31.950398 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 19:23:31.950535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:31.950440 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-brv87\"" Apr 20 19:23:31.957369 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:31.957349 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf"] Apr 20 19:23:31.995735 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:31.995704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/060b5a16-fbe2-4e36-b398-85d8ac1178a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v8rqf\" (UID: \"060b5a16-fbe2-4e36-b398-85d8ac1178a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" Apr 20 19:23:32.096704 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:32.096681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/060b5a16-fbe2-4e36-b398-85d8ac1178a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v8rqf\" (UID: \"060b5a16-fbe2-4e36-b398-85d8ac1178a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" Apr 20 19:23:32.096811 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:32.096796 2577 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 20 19:23:32.096861 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:23:32.096852 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/060b5a16-fbe2-4e36-b398-85d8ac1178a9-monitoring-plugin-cert podName:060b5a16-fbe2-4e36-b398-85d8ac1178a9 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:32.596837457 +0000 UTC m=+177.311209523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/060b5a16-fbe2-4e36-b398-85d8ac1178a9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-v8rqf" (UID: "060b5a16-fbe2-4e36-b398-85d8ac1178a9") : secret "monitoring-plugin-cert" not found Apr 20 19:23:32.600822 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:32.600791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/060b5a16-fbe2-4e36-b398-85d8ac1178a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v8rqf\" (UID: \"060b5a16-fbe2-4e36-b398-85d8ac1178a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" Apr 20 19:23:32.603136 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:32.603106 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/060b5a16-fbe2-4e36-b398-85d8ac1178a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v8rqf\" (UID: \"060b5a16-fbe2-4e36-b398-85d8ac1178a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" Apr 20 19:23:32.857193 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:32.857109 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" Apr 20 19:23:32.973472 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:32.973441 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf"] Apr 20 19:23:32.976451 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:32.976427 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod060b5a16_fbe2_4e36_b398_85d8ac1178a9.slice/crio-24a8566d776a33733ebb6ca211f5ca2eaba575c08a169664bf4d59791c7115f5 WatchSource:0}: Error finding container 24a8566d776a33733ebb6ca211f5ca2eaba575c08a169664bf4d59791c7115f5: Status 404 returned error can't find the container with id 24a8566d776a33733ebb6ca211f5ca2eaba575c08a169664bf4d59791c7115f5 Apr 20 19:23:33.342811 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.342781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" event={"ID":"060b5a16-fbe2-4e36-b398-85d8ac1178a9","Type":"ContainerStarted","Data":"24a8566d776a33733ebb6ca211f5ca2eaba575c08a169664bf4d59791c7115f5"} Apr 20 19:23:33.391539 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.391513 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:23:33.396881 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.396863 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:23:33.400264 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.399524 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 19:23:33.400264 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.399831 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 19:23:33.401279 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.401045 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 19:23:33.401279 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.401088 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 19:23:33.401783 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.401761 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 19:23:33.401783 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.401778 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 19:23:33.402440 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.402097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-71uca88g07p96\"" Apr 20 19:23:33.402440 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.402179 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 19:23:33.402440 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.402429 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 
19:23:33.402634 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.402493 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 19:23:33.402634 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.402521 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-652dp\""
Apr 20 19:23:33.402949 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.402930 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 19:23:33.403312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.403248 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 19:23:33.403312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.403299 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 19:23:33.405803 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.405781 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 19:23:33.412018 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.412000 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:23:33.507390 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507652 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507652 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507652 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507652 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507874 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-web-config\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507874 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507874 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507768 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507874 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.507874 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.508057 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-config\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.508057 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507907 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.508057 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-config-out\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.508057 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507968 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkzzm\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-kube-api-access-zkzzm\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.508057 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.507995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.608963 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.608872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.608963 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.608921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.608963 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.608951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229
ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609036 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-web-config\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.609747 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.609265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-config\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-config-out\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkzzm\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-kube-api-access-zkzzm\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.610970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.612985 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.612946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.613227 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.613200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.613326 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.613276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.613326 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.613282 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-config-out\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.613759 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.613719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.614261 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.614220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.614354 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.614303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-web-config\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.614354 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.614335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20
19:23:33.614841 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.614794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.614990 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.614947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-config\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.615567 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.615540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.615840 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.615818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.616335 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.616315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.616944 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.616924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.624337 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.624319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkzzm\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-kube-api-access-zkzzm\") pod \"prometheus-k8s-0\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.710876 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.710844 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:23:33.846431 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:33.846398 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:23:33.849332 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:23:33.849308 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaba5cf0d_edd0_466d_a46b_89dbd4806065.slice/crio-607359f1a851b6f875a1cd0a4f846f89e783a2ae9db09f3e6b9bf085040ad35e WatchSource:0}: Error finding container 607359f1a851b6f875a1cd0a4f846f89e783a2ae9db09f3e6b9bf085040ad35e: Status 404 returned error can't find the container with id 607359f1a851b6f875a1cd0a4f846f89e783a2ae9db09f3e6b9bf085040ad35e
Apr 20 19:23:34.346356 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:34.346329 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerStarted","Data":"607359f1a851b6f875a1cd0a4f846f89e783a2ae9db09f3e6b9bf085040ad35e"}
Apr 20 19:23:35.350227 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:35.350194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" event={"ID":"060b5a16-fbe2-4e36-b398-85d8ac1178a9","Type":"ContainerStarted","Data":"b74df60253408557a0837045824ddb73fb80f772ef131cdbaba2cd15641ed43a"}
Apr 20 19:23:35.350717 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:35.350438 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf"
Apr 20 19:23:35.351694 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:35.351666 2577 generic.go:358] "Generic (PLEG): container finished" podID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" exitCode=0
Apr 20 19:23:35.351818 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:35.351746 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43"}
Apr 20 19:23:35.355543 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:35.355522 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf"
Apr 20 19:23:35.365276 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:35.365235 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v8rqf" podStartSLOduration=3.02198989 podStartE2EDuration="4.36522208s" podCreationTimestamp="2026-04-20 19:23:31 +0000 UTC" firstStartedPulling="2026-04-20 19:23:32.978269895 +0000 UTC m=+177.692641964" lastFinishedPulling="2026-04-20 19:23:34.321502083 +0000 UTC m=+179.035874154" observedRunningTime="2026-04-20 19:23:35.364111058 +0000 UTC m=+180.078483147" watchObservedRunningTime="2026-04-20 19:23:35.36522208 +0000 UTC m=+180.079594215"
Apr 20 19:23:36.321927 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:36.321898 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qksj4"
Apr 20 19:23:39.364772 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:39.364721 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerStarted","Data":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"}
Apr 20 19:23:39.364772 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:39.364773 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerStarted","Data":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"}
Apr 20 19:23:41.299028 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:41.299002 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-85d95c7f6-4v9jj"
Apr 20 19:23:41.374189 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:41.374159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerStarted","Data":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"}
Apr 20 19:23:41.374189 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:41.374191 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerStarted","Data":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"}
Apr 20 19:23:41.374189 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:41.374200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerStarted","Data":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"}
Apr 20 19:23:41.374189 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:41.374208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerStarted","Data":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"}
Apr 20 19:23:41.402300 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:41.402217 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.7816627889999999 podStartE2EDuration="8.402197274s" podCreationTimestamp="2026-04-20 19:23:33 +0000 UTC" firstStartedPulling="2026-04-20 19:23:33.851623073 +0000 UTC m=+178.565995138" lastFinishedPulling="2026-04-20 19:23:40.472157548 +0000 UTC m=+185.186529623" observedRunningTime="2026-04-20 19:23:41.400118767 +0000 UTC m=+186.114490854" watchObservedRunningTime="2026-04-20 19:23:41.402197274 +0000 UTC m=+186.116569363"
Apr 20 19:23:43.710970 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:23:43.710939 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:24:01.427965 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:01.427935 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf27dde3-1580-4f60-ad2f-abd6f261c5c1" containerID="48c4953bca52a5b1c364fce39e1cc9dbca8dfd2e1047ba077c2bf33227715943" exitCode=0
Apr 20 19:24:01.428359 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:01.427978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tjv47" event={"ID":"cf27dde3-1580-4f60-ad2f-abd6f261c5c1","Type":"ContainerDied","Data":"48c4953bca52a5b1c364fce39e1cc9dbca8dfd2e1047ba077c2bf33227715943"}
Apr 20 19:24:01.428359 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:01.428255 2577 scope.go:117] "RemoveContainer" containerID="48c4953bca52a5b1c364fce39e1cc9dbca8dfd2e1047ba077c2bf33227715943"
Apr 20 19:24:02.432082 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:02.432049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tjv47" event={"ID":"cf27dde3-1580-4f60-ad2f-abd6f261c5c1","Type":"ContainerStarted","Data":"19284e97eac4a1a153ec2d0483ce6cf944277babdae06dd463265feb52ed3e8c"}
Apr 20 19:24:04.395779 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:04.395751 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-524jk_d9241b80-47a6-4cf4-8485-01b585082093/cluster-monitoring-operator/0.log"
Apr 20 19:24:05.406765 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:05.406720 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-v8rqf_060b5a16-fbe2-4e36-b398-85d8ac1178a9/monitoring-plugin/0.log"
Apr 20 19:24:06.793194 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:06.793163 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vxd2g_4a07410c-4c3a-40a9-955d-2fc040fffc3a/init-textfile/0.log"
Apr 20 19:24:06.993164 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:06.993139 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vxd2g_4a07410c-4c3a-40a9-955d-2fc040fffc3a/node-exporter/0.log"
Apr 20 19:24:07.195307 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:07.195236 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vxd2g_4a07410c-4c3a-40a9-955d-2fc040fffc3a/kube-rbac-proxy/0.log"
Apr 20 19:24:07.993069 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:07.993040 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aba5cf0d-edd0-466d-a46b-89dbd4806065/init-config-reloader/0.log"
Apr 20 19:24:08.194633 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:08.194603 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aba5cf0d-edd0-466d-a46b-89dbd4806065/prometheus/0.log"
Apr 20 19:24:08.393710 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:08.393637 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aba5cf0d-edd0-466d-a46b-89dbd4806065/config-reloader/0.log"
Apr 20 19:24:08.593831 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:08.593806 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aba5cf0d-edd0-466d-a46b-89dbd4806065/thanos-sidecar/0.log"
Apr 20 19:24:08.793539 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:08.793512 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aba5cf0d-edd0-466d-a46b-89dbd4806065/kube-rbac-proxy-web/0.log"
Apr 20 19:24:08.993148 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:08.993122 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aba5cf0d-edd0-466d-a46b-89dbd4806065/kube-rbac-proxy/0.log"
Apr 20 19:24:09.192534 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:09.192465 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_aba5cf0d-edd0-466d-a46b-89dbd4806065/kube-rbac-proxy-thanos/0.log"
Apr 20
19:24:12.596670 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:12.596642 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7cqnx_4a717388-605c-4d9d-8381-4bbf7fe371fb/serve-healthcheck-canary/0.log"
Apr 20 19:24:16.470824 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:16.470753 2577 generic.go:358] "Generic (PLEG): container finished" podID="1a448f9a-4a4e-4017-8ec1-117e5a1efe2d" containerID="61986ea3059a5559304d262b52937e692822bdbe6303294a7b41ee1ed1ba47b6" exitCode=0
Apr 20 19:24:16.471146 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:16.470826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" event={"ID":"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d","Type":"ContainerDied","Data":"61986ea3059a5559304d262b52937e692822bdbe6303294a7b41ee1ed1ba47b6"}
Apr 20 19:24:16.471146 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:16.471075 2577 scope.go:117] "RemoveContainer" containerID="61986ea3059a5559304d262b52937e692822bdbe6303294a7b41ee1ed1ba47b6"
Apr 20 19:24:17.475364 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:17.475326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2sdjt" event={"ID":"1a448f9a-4a4e-4017-8ec1-117e5a1efe2d","Type":"ContainerStarted","Data":"b3d4268278f380460b2cd72508e4802ee5ab18b0310b25252305a1b6f39b8d9b"}
Apr 20 19:24:26.506008 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:26.505976 2577 generic.go:358] "Generic (PLEG): container finished" podID="c19d268b-1a81-44d2-9b22-adc4e7ec01d0" containerID="11959146459fe01c43155614292cb5eb74d116f233129e310c0c4f142d24af82" exitCode=0
Apr 20 19:24:26.506393 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:26.506017 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" event={"ID":"c19d268b-1a81-44d2-9b22-adc4e7ec01d0","Type":"ContainerDied","Data":"11959146459fe01c43155614292cb5eb74d116f233129e310c0c4f142d24af82"}
Apr 20 19:24:26.506393 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:26.506319 2577 scope.go:117] "RemoveContainer" containerID="11959146459fe01c43155614292cb5eb74d116f233129e310c0c4f142d24af82"
Apr 20 19:24:27.509773 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:27.509717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-t96hc" event={"ID":"c19d268b-1a81-44d2-9b22-adc4e7ec01d0","Type":"ContainerStarted","Data":"3cbc09023e665326dbe01d8cb0e875b9c8948914454e83cce11a37c02fe4a6b1"}
Apr 20 19:24:33.711829 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:33.711792 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:24:33.729950 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:33.729927 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:24:34.546549 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:34.546522 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:24:47.590710 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:47.590635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:24:47.593255 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:47.593235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39c06111-8b7a-4d9f-a3de-f5c655ac387d-metrics-certs\") pod \"network-metrics-daemon-tssws\" (UID: \"39c06111-8b7a-4d9f-a3de-f5c655ac387d\") " pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:24:47.734873 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:47.734850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gxr6p\""
Apr 20 19:24:47.742662 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:47.742648 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tssws"
Apr 20 19:24:47.858675 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:47.858598 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tssws"]
Apr 20 19:24:47.862404 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:24:47.862367 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c06111_8b7a_4d9f_a3de_f5c655ac387d.slice/crio-70838adde01a9ee63f3d204f4163ccc5d90c33dd40a546f314e4069e216a9f23 WatchSource:0}: Error finding container 70838adde01a9ee63f3d204f4163ccc5d90c33dd40a546f314e4069e216a9f23: Status 404 returned error can't find the container with id 70838adde01a9ee63f3d204f4163ccc5d90c33dd40a546f314e4069e216a9f23
Apr 20 19:24:48.569691 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:48.569653 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tssws" event={"ID":"39c06111-8b7a-4d9f-a3de-f5c655ac387d","Type":"ContainerStarted","Data":"70838adde01a9ee63f3d204f4163ccc5d90c33dd40a546f314e4069e216a9f23"}
Apr 20 19:24:49.574389 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:49.574346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tssws" event={"ID":"39c06111-8b7a-4d9f-a3de-f5c655ac387d","Type":"ContainerStarted","Data":"ea74b57b5cef98a7794da8289ea5d2621310a42510c5191884155e0fba5e9bc4"}
Apr 20 19:24:49.574389 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:49.574390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tssws" event={"ID":"39c06111-8b7a-4d9f-a3de-f5c655ac387d","Type":"ContainerStarted","Data":"19b8223eb8fc49e24b2e213cb3013037778633eebb84e506e9916b9f3b301e52"}
Apr 20 19:24:49.589518 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:49.589479 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tssws" podStartSLOduration=253.631455677 podStartE2EDuration="4m14.589467145s" podCreationTimestamp="2026-04-20 19:20:35 +0000 UTC" firstStartedPulling="2026-04-20 19:24:47.864473595 +0000 UTC m=+252.578845664" lastFinishedPulling="2026-04-20 19:24:48.822485066 +0000 UTC m=+253.536857132" observedRunningTime="2026-04-20 19:24:49.589341241 +0000 UTC m=+254.303713328" watchObservedRunningTime="2026-04-20 19:24:49.589467145 +0000 UTC m=+254.303839232"
Apr 20 19:24:51.743363 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.739801 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:24:51.743363 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.740599 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="prometheus" containerID="cri-o://40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" gracePeriod=600
Apr 20 19:24:51.743363 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.740839 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-web" containerID="cri-o://416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" gracePeriod=600
Apr 20 19:24:51.743363 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.740854 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy" containerID="cri-o://3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" gracePeriod=600
Apr 20 19:24:51.743363 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.740933 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="thanos-sidecar" containerID="cri-o://02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" gracePeriod=600
Apr 20 19:24:51.743363 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.741004 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="config-reloader" containerID="cri-o://20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" gracePeriod=600
Apr 20 19:24:51.743363 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.741009 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-thanos" containerID="cri-o://61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" gracePeriod=600
Apr 20 19:24:51.975825 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:51.975796 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:24:52.027562 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027504 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-tls-assets\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027562 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027542 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-db\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027572 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-kubelet-serving-ca-bundle\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027590 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-metrics-client-ca\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027608 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-web-config\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027746 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-grpc-tls\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027979 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027818 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkzzm\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-kube-api-access-zkzzm\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027979 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027851 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-config\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027979 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027900 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-tls\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027979 ip-10-0-133-149 kubenswrapper[2577]:
I0420 19:24:52.027932 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-serving-certs-ca-bundle\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.027979 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027958 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-trusted-ca-bundle\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.028220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027992 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.028220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.028021 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-kube-rbac-proxy\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.028220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.028048 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.028220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.028077 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-rulefiles-0\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.028220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.028125 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-config-out\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.028220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.028170 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-metrics-client-certs\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.028220 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.028196 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-thanos-prometheus-http-client-file\") pod \"aba5cf0d-edd0-466d-a46b-89dbd4806065\" (UID: \"aba5cf0d-edd0-466d-a46b-89dbd4806065\") "
Apr 20 19:24:52.031833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027984 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:24:52.031833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.027992 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:24:52.031833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.028612 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:24:52.031833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.030336 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:24:52.031833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.030979 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.031833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.031208 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:24:52.031833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.031784 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:24:52.032282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.031906 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:24:52.032282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.031971 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-config" (OuterVolumeSpecName: "config") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.032471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.032442 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-kube-api-access-zkzzm" (OuterVolumeSpecName: "kube-api-access-zkzzm") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "kube-api-access-zkzzm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:24:52.032827 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.032781 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.033054 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.033018 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.033149 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.033065 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.033149 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.033113 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.034032 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.034007 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.034119 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.034030 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-config-out" (OuterVolumeSpecName: "config-out") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:24:52.034119 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.034082 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:52.041508 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.041489 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-web-config" (OuterVolumeSpecName: "web-config") pod "aba5cf0d-edd0-466d-a46b-89dbd4806065" (UID: "aba5cf0d-edd0-466d-a46b-89dbd4806065"). InnerVolumeSpecName "web-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:24:52.130147 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130111 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-metrics-client-certs\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130147 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130145 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130147 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130156 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-tls-assets\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130165 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-db\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130185 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130194 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-metrics-client-ca\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130203 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-web-config\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130212 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-grpc-tls\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130221 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkzzm\" (UniqueName: \"kubernetes.io/projected/aba5cf0d-edd0-466d-a46b-89dbd4806065-kube-api-access-zkzzm\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130229 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-config\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130237 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-tls\") 
on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130246 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130255 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130265 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130274 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-kube-rbac-proxy\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130283 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/aba5cf0d-edd0-466d-a46b-89dbd4806065-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130292 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/aba5cf0d-edd0-466d-a46b-89dbd4806065-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.130304 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.130301 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aba5cf0d-edd0-466d-a46b-89dbd4806065-config-out\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:24:52.586252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586219 2577 generic.go:358] "Generic (PLEG): container finished" podID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" exitCode=0 Apr 20 19:24:52.586252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586243 2577 generic.go:358] "Generic (PLEG): container finished" podID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" exitCode=0 Apr 20 19:24:52.586252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586250 2577 generic.go:358] "Generic (PLEG): container finished" podID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" exitCode=0 Apr 20 19:24:52.586252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586255 2577 generic.go:358] "Generic (PLEG): container finished" podID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" exitCode=0 Apr 20 
19:24:52.586252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586261 2577 generic.go:358] "Generic (PLEG): container finished" podID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" exitCode=0 Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586266 2577 generic.go:358] "Generic (PLEG): container finished" podID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" exitCode=0 Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"} Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586322 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"} Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"} Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"} Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"} Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"} Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"aba5cf0d-edd0-466d-a46b-89dbd4806065","Type":"ContainerDied","Data":"607359f1a851b6f875a1cd0a4f846f89e783a2ae9db09f3e6b9bf085040ad35e"} Apr 20 19:24:52.586556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.586403 2577 scope.go:117] "RemoveContainer" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.595714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.595667 2577 scope.go:117] "RemoveContainer" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.602734 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.602700 2577 
scope.go:117] "RemoveContainer" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.608665 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.608651 2577 scope.go:117] "RemoveContainer" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.614127 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.614107 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:24:52.615192 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.615178 2577 scope.go:117] "RemoveContainer" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.619569 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.619546 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:24:52.621919 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.621905 2577 scope.go:117] "RemoveContainer" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.627986 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.627969 2577 scope.go:117] "RemoveContainer" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.633635 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.633618 2577 scope.go:117] "RemoveContainer" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.633888 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:24:52.633870 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": container with ID starting with 61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392 not found: ID does not exist" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.633943 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.633900 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"} err="failed to get container status \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": rpc error: code = NotFound desc = could not find container \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": container with ID starting with 61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392 not found: ID does not exist" Apr 20 19:24:52.633943 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.633931 2577 scope.go:117] "RemoveContainer" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.634167 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:24:52.634151 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": container with ID starting with 3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870 not found: ID does not exist" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.634219 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634173 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"} err="failed to get container status 
\"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": rpc error: code = NotFound desc = could not find container \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": container with ID starting with 3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870 not found: ID does not exist" Apr 20 19:24:52.634219 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634190 2577 scope.go:117] "RemoveContainer" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.634394 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:24:52.634378 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": container with ID starting with 416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6 not found: ID does not exist" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.634438 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634399 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"} err="failed to get container status \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": rpc error: code = NotFound desc = could not find container \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": container with ID starting with 416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6 not found: ID does not exist" Apr 20 19:24:52.634438 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634412 2577 scope.go:117] "RemoveContainer" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.634630 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:24:52.634613 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": container with ID starting with 02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50 not found: ID does not exist" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.634667 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634638 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"} err="failed to get container status \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": rpc error: code = NotFound desc = could not find container \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": container with ID starting with 02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50 not found: ID does not exist" Apr 20 19:24:52.634667 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634652 2577 scope.go:117] "RemoveContainer" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.634940 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:24:52.634921 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": container with ID starting with 20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362 not found: ID does not exist" 
containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.634991 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634943 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"} err="failed to get container status \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": rpc error: code = NotFound desc = could not find container \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": container with ID starting with 20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362 not found: ID does not exist" Apr 20 19:24:52.634991 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.634954 2577 scope.go:117] "RemoveContainer" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.635162 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:24:52.635146 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": container with ID starting with 40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552 not found: ID does not exist" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.635204 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635166 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"} err="failed to get container status \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": rpc error: code = NotFound desc = could not find container \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": container with ID starting with 40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552 not found: ID does not exist" Apr 20 19:24:52.635204 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635181 2577 scope.go:117] "RemoveContainer" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.635384 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:24:52.635367 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": container with ID starting with f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43 not found: ID does not exist" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.635422 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635388 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43"} err="failed to get container status \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": rpc error: code = NotFound desc = could not find container \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": container with ID starting with f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43 not found: ID does not exist" Apr 20 19:24:52.635422 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635401 2577 scope.go:117] "RemoveContainer" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.635597 ip-10-0-133-149 kubenswrapper[2577]: I0420 
19:24:52.635581 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"} err="failed to get container status \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": rpc error: code = NotFound desc = could not find container \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": container with ID starting with 61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392 not found: ID does not exist" Apr 20 19:24:52.635640 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635597 2577 scope.go:117] "RemoveContainer" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.635813 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635797 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"} err="failed to get container status \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": rpc error: code = NotFound desc = could not find container \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": container with ID starting with 3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870 not found: ID does not exist" Apr 20 19:24:52.635861 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635813 2577 scope.go:117] "RemoveContainer" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.635988 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635973 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"} err="failed to get container status \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": rpc error: code = NotFound desc = could not find container \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": container with ID starting with 416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6 not found: ID does not exist" Apr 20 19:24:52.636034 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.635989 2577 scope.go:117] "RemoveContainer" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.636158 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636142 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"} err="failed to get container status \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": rpc error: code = NotFound desc = could not find container \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": container with ID starting with 02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50 not found: ID does not exist" Apr 20 19:24:52.636204 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636158 2577 scope.go:117] "RemoveContainer" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.636334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636320 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"} err="failed to get container status \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": rpc error: code = 
NotFound desc = could not find container \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": container with ID starting with 20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362 not found: ID does not exist" Apr 20 19:24:52.636380 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636335 2577 scope.go:117] "RemoveContainer" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.636508 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636494 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"} err="failed to get container status \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": rpc error: code = NotFound desc = could not find container \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": container with ID starting with 40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552 not found: ID does not exist" Apr 20 19:24:52.636551 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636508 2577 scope.go:117] "RemoveContainer" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.636669 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636653 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43"} err="failed to get container status \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": rpc error: code = NotFound desc = could not find container \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": container with ID starting with f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43 not found: ID does not exist" Apr 20 19:24:52.636669 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636669 2577 scope.go:117] "RemoveContainer" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.636839 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636823 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"} err="failed to get container status \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": rpc error: code = NotFound desc = could not find container \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": container with ID starting with 61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392 not found: ID does not exist" Apr 20 19:24:52.636889 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636840 2577 scope.go:117] "RemoveContainer" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.637008 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.636994 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"} err="failed to get container status \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": rpc error: code = NotFound desc = could not find container \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": container with ID starting with 3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870 not found: ID does not exist" Apr 20 19:24:52.637053 ip-10-0-133-149 kubenswrapper[2577]: I0420 
19:24:52.637008 2577 scope.go:117] "RemoveContainer" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.637181 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637166 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"} err="failed to get container status \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": rpc error: code = NotFound desc = could not find container \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": container with ID starting with 416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6 not found: ID does not exist" Apr 20 19:24:52.637181 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637180 2577 scope.go:117] "RemoveContainer" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.637392 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637372 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"} err="failed to get container status \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": rpc error: code = NotFound desc = could not find container \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": container with ID starting with 02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50 not found: ID does not exist" Apr 20 19:24:52.637392 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637392 2577 scope.go:117] "RemoveContainer" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.637618 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637592 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"} err="failed to get container status \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": rpc error: code = NotFound desc = could not find container \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": container with ID starting with 20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362 not found: ID does not exist" Apr 20 19:24:52.637707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637620 2577 scope.go:117] "RemoveContainer" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.637929 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637910 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"} err="failed to get container status \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": rpc error: code = NotFound desc = could not find container \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": container with ID starting with 40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552 not found: ID does not exist" Apr 20 19:24:52.638003 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.637929 2577 scope.go:117] "RemoveContainer" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.638120 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638102 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43"} err="failed to get container status \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": rpc error: code = NotFound desc = could not find container \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": container with ID starting with f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43 not found: ID does not exist" Apr 20 19:24:52.638174 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638123 2577 scope.go:117] "RemoveContainer" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.638267 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638250 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"} err="failed to get container status \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": rpc error: code = NotFound desc = could not find container \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": container with ID starting with 61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392 not found: ID does not exist" Apr 20 19:24:52.638267 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638267 2577 scope.go:117] "RemoveContainer" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.638489 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638472 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"} err="failed to get container status \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": rpc error: code = NotFound desc = could not find container \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": container with ID starting with 3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870 not found: ID does not exist" Apr 20 19:24:52.638552 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638490 2577 scope.go:117] "RemoveContainer" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.638693 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638676 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"} err="failed to get container status \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": rpc error: code = NotFound desc = could not find container \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": container with ID starting with 416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6 not found: ID does not exist" Apr 20 19:24:52.638758 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638692 2577 scope.go:117] "RemoveContainer" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.638878 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638863 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"} err="failed to get container status \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": rpc error: code = NotFound desc = could not find container 
\"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": container with ID starting with 02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50 not found: ID does not exist" Apr 20 19:24:52.638924 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.638877 2577 scope.go:117] "RemoveContainer" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.639037 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639023 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"} err="failed to get container status \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": rpc error: code = NotFound desc = could not find container \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": container with ID starting with 20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362 not found: ID does not exist" Apr 20 19:24:52.639081 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639037 2577 scope.go:117] "RemoveContainer" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.639216 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639196 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"} err="failed to get container status \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": rpc error: code = NotFound desc = could not find container \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": container with ID starting with 40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552 not found: ID does not exist" Apr 20 19:24:52.639281 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639217 2577 scope.go:117] "RemoveContainer" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.639404 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639382 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43"} err="failed to get container status \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": rpc error: code = NotFound desc = could not find container \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": container with ID starting with f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43 not found: ID does not exist" Apr 20 19:24:52.639457 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639407 2577 scope.go:117] "RemoveContainer" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.639575 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639558 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"} err="failed to get container status \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": rpc error: code = NotFound desc = could not find container \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": container with ID starting with 61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392 not found: ID does not exist" Apr 20 19:24:52.639618 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639575 2577 scope.go:117] 
"RemoveContainer" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.639788 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639772 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"} err="failed to get container status \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": rpc error: code = NotFound desc = could not find container \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": container with ID starting with 3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870 not found: ID does not exist" Apr 20 19:24:52.639859 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639790 2577 scope.go:117] "RemoveContainer" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.639991 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639972 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"} err="failed to get container status \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": rpc error: code = NotFound desc = could not find container \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": container with ID starting with 416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6 not found: ID does not exist" Apr 20 19:24:52.640033 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.639992 2577 scope.go:117] "RemoveContainer" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.640184 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640169 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"} err="failed to get container status \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": rpc error: code = NotFound desc = could not find container \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": container with ID starting with 02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50 not found: ID does not exist" Apr 20 19:24:52.640184 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640184 2577 scope.go:117] "RemoveContainer" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.640357 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640341 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"} err="failed to get container status \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": rpc error: code = NotFound desc = could not find container \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": container with ID starting with 20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362 not found: ID does not exist" Apr 20 19:24:52.640421 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640359 2577 scope.go:117] "RemoveContainer" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.640554 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640536 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"} err="failed to get container status \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": rpc error: code = NotFound desc = could not find container \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": container with ID starting with 40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552 not found: ID does not exist" Apr 20 19:24:52.640597 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640554 2577 scope.go:117] "RemoveContainer" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.640718 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640704 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43"} err="failed to get container status \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": rpc error: code = NotFound desc = could not find container \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": container with ID starting with f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43 not found: ID does not exist" Apr 20 19:24:52.640780 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640718 2577 scope.go:117] "RemoveContainer" containerID="61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392" Apr 20 19:24:52.640904 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640889 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392"} err="failed to get container status \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": rpc error: code = NotFound desc = could not find container \"61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392\": container with ID starting with 61d8f8910c30b6d2454e73b37c7b40cef8d5dce76a8e1abf9f9c80482a25c392 not found: ID does not exist" Apr 20 19:24:52.640949 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.640904 2577 scope.go:117] "RemoveContainer" containerID="3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870" Apr 20 19:24:52.641074 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641056 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870"} err="failed to get container status \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": rpc error: code = NotFound desc = could not find container \"3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870\": container with ID starting with 3cab0fcd579b60dc99a482e931e4b9c94f2f7a3f9aecc556b2f7387bfa43b870 not found: ID does not exist" Apr 20 19:24:52.641133 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641076 2577 scope.go:117] "RemoveContainer" containerID="416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6" Apr 20 19:24:52.641273 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641257 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6"} err="failed to get container status \"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": rpc error: code = NotFound desc = could not find container 
\"416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6\": container with ID starting with 416347f23b2ac1377e7e5d8bbf627dcbfc9a5e478dd5d45aeae5cea41e966bb6 not found: ID does not exist" Apr 20 19:24:52.641312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641273 2577 scope.go:117] "RemoveContainer" containerID="02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50" Apr 20 19:24:52.641464 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641447 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50"} err="failed to get container status \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": rpc error: code = NotFound desc = could not find container \"02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50\": container with ID starting with 02a2275af93eada15472d9b47de5cf51221eb46eba7bc6318e9b45c382200a50 not found: ID does not exist" Apr 20 19:24:52.641514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641466 2577 scope.go:117] "RemoveContainer" containerID="20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362" Apr 20 19:24:52.641676 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641659 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362"} err="failed to get container status \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": rpc error: code = NotFound desc = could not find container \"20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362\": container with ID starting with 20bc23712784e744b3d7dc0947277d77f96510c83e2d723042db97ee88a69362 not found: ID does not exist" Apr 20 19:24:52.641714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641678 2577 scope.go:117] "RemoveContainer" containerID="40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552" Apr 20 19:24:52.641827 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641813 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552"} err="failed to get container status \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": rpc error: code = NotFound desc = could not find container \"40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552\": container with ID starting with 40c3406ffaa1dfed8a2553fb5c9eaca4cf0b8d56879236e4de05417c16abf552 not found: ID does not exist" Apr 20 19:24:52.641865 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641827 2577 scope.go:117] "RemoveContainer" containerID="f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43" Apr 20 19:24:52.642011 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.641995 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43"} err="failed to get container status \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": rpc error: code = NotFound desc = could not find container \"f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43\": container with ID starting with f4ed00fdb793e48d3d6d0e820fc1a32c6bf7c3a34e6a88b040ba7e92db66db43 not found: ID does not exist" Apr 20 19:24:52.647055 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647038 2577 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:24:52.647279 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647268 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="init-config-reloader" Apr 20 19:24:52.647321 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647281 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="init-config-reloader" Apr 20 19:24:52.647321 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647288 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="config-reloader" Apr 20 19:24:52.647321 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647294 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="config-reloader" Apr 20 19:24:52.647321 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647309 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy" Apr 20 19:24:52.647321 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647314 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy" Apr 20 19:24:52.647321 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647320 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-thanos" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647325 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-thanos" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647335 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="thanos-sidecar" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647341 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="thanos-sidecar" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647347 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="prometheus" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647352 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="prometheus" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647360 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-web" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647365 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-web" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647403 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="thanos-sidecar" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647411 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="prometheus" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647417 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-web" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647425 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647431 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="config-reloader" Apr 20 19:24:52.647483 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.647436 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" containerName="kube-rbac-proxy-thanos" Apr 20 19:24:52.652589 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.652573 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.655228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.655214 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 19:24:52.656676 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.656657 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 19:24:52.656791 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.656675 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 19:24:52.656791 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.656692 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 19:24:52.656898 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.656800 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 19:24:52.656939 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.656925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 19:24:52.657039 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.657017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 19:24:52.657126 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.657103 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-71uca88g07p96\"" Apr 20 19:24:52.657182 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.657141 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 19:24:52.657501 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.657488 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 19:24:52.657627 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.657607 2577 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 19:24:52.659492 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.659475 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-652dp\"" Apr 20 19:24:52.660114 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.660097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 19:24:52.661702 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.661682 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 19:24:52.664780 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.664761 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 19:24:52.679969 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.679935 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:24:52.732932 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.732912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733036 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.732952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733036 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.732977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733036 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733142 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733142 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733142 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl28l\" (UniqueName: \"kubernetes.io/projected/7a33ca71-8d9f-45c6-bb56-aab488691412-kube-api-access-tl28l\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733253 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733253 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a33ca71-8d9f-45c6-bb56-aab488691412-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733253 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733193 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733253 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733253 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733445 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a33ca71-8d9f-45c6-bb56-aab488691412-config-out\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733445 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733445 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-config\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733445 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-web-config\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733445 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.733445 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.733396 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.833970 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.833947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.834282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.833974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.834282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.833996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a33ca71-8d9f-45c6-bb56-aab488691412-config-out\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.834282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.834011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.834282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.834027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-config\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.834282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.834047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-web-config\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.834282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.834083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.834282 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.834109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835050 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.834944 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835050 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.834995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835050 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl28l\" (UniqueName: \"kubernetes.io/projected/7a33ca71-8d9f-45c6-bb56-aab488691412-kube-api-access-tl28l\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a33ca71-8d9f-45c6-bb56-aab488691412-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.835803 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.835774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.836597 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.836540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.837384 
ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.836833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.837384 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.837195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.837384 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.837308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.837384 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.837317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.837802 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.837708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-web-config\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.837886 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.837841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-config\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.837939 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.837907 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7a33ca71-8d9f-45c6-bb56-aab488691412-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.838132 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.838110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a33ca71-8d9f-45c6-bb56-aab488691412-config-out\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.838803 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.838781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.839047 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.839026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.839246 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.839228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a33ca71-8d9f-45c6-bb56-aab488691412-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.839809 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.839791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.839918 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.839902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.839981 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.839934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7a33ca71-8d9f-45c6-bb56-aab488691412-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.843772 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.843711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl28l\" (UniqueName: \"kubernetes.io/projected/7a33ca71-8d9f-45c6-bb56-aab488691412-kube-api-access-tl28l\") pod \"prometheus-k8s-0\" (UID: \"7a33ca71-8d9f-45c6-bb56-aab488691412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:52.961586 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:52.961564 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:24:53.084335 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:53.084312 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:24:53.086543 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:24:53.086515 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a33ca71_8d9f_45c6_bb56_aab488691412.slice/crio-800d0eb325f5fb1767710220039284a30c0c35cf2b5a89a9ae1449e1b56e0a70 WatchSource:0}: Error finding container 800d0eb325f5fb1767710220039284a30c0c35cf2b5a89a9ae1449e1b56e0a70: Status 404 returned error can't find the container with id 800d0eb325f5fb1767710220039284a30c0c35cf2b5a89a9ae1449e1b56e0a70 Apr 20 19:24:53.590862 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:53.590828 2577 generic.go:358] "Generic (PLEG): container finished" podID="7a33ca71-8d9f-45c6-bb56-aab488691412" containerID="93268bea7278d4384bfd944d69cb2a63380a6793054a9d5c15e704adb7d77031" exitCode=0 Apr 20 19:24:53.591048 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:53.590912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerDied","Data":"93268bea7278d4384bfd944d69cb2a63380a6793054a9d5c15e704adb7d77031"} Apr 20 19:24:53.591048 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:53.590951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerStarted","Data":"800d0eb325f5fb1767710220039284a30c0c35cf2b5a89a9ae1449e1b56e0a70"} Apr 20 19:24:53.837547 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:53.837512 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba5cf0d-edd0-466d-a46b-89dbd4806065" path="/var/lib/kubelet/pods/aba5cf0d-edd0-466d-a46b-89dbd4806065/volumes" Apr 20 19:24:54.597932 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:54.597893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerStarted","Data":"77dabd305ae41334921c647a6fd0aab8f200f60eb0f2e3dc499d47d6c72a1553"} Apr 20 19:24:54.597932 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:54.597932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerStarted","Data":"81414a4d813cbe1b2ffffee2d53791111ac30520b254ebe67efe77750b24f8c0"} Apr 20 19:24:54.598153 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:54.597945 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerStarted","Data":"a89921c15409edf2d1578664f93d4ddaba1999affff7df65075a40876b48104b"} Apr 20 19:24:54.598153 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:54.597956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerStarted","Data":"71c31eb646ee0b6ac87ce2f91a4ac85e9715759f12a6fb2f18ac9d28faf66745"} Apr 20 19:24:54.598153 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:54.597969 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerStarted","Data":"fbc38900885e11d7f442e3fe11991135c64b6db17509f7831385a29928336fdf"} Apr 20 19:24:54.598153 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:54.597979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7a33ca71-8d9f-45c6-bb56-aab488691412","Type":"ContainerStarted","Data":"4aba2a13d930e95a2ef903dc0b71d17788e0f81cb3339c63fbce9db8a83fdfcc"} Apr 20 19:24:54.624774 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:54.624711 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.624695892 podStartE2EDuration="2.624695892s" podCreationTimestamp="2026-04-20 19:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:24:54.622748218 +0000 UTC m=+259.337120303" watchObservedRunningTime="2026-04-20 19:24:54.624695892 +0000 UTC m=+259.339067982" Apr 20 19:24:57.962301 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:24:57.962266 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:25:35.752367 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:25:35.752343 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log" Apr 20 19:25:35.752877 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:25:35.752859 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log" Apr 20 19:25:35.756873 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:25:35.756855 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 19:25:52.962027 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:25:52.961993 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:25:52.977495 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:25:52.977473 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:25:53.779475 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:25:53.779450 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:28:21.304755 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.304710 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq"] Apr 20 19:28:21.307914 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.307895 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.310481 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.310462 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:28:21.310666 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.310652 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:28:21.311046 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.311028 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:28:21.311140 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.311093 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:28:21.311318 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.311302 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pkszx\"" Apr 20 19:28:21.326830 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.326810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq"] Apr 20 19:28:21.418439 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.418411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9227d6c3-5736-4590-b1af-eaf1d30a0b56-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.418576 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.418463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9227d6c3-5736-4590-b1af-eaf1d30a0b56-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.418576 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.418491 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s752g\" (UniqueName: \"kubernetes.io/projected/9227d6c3-5736-4590-b1af-eaf1d30a0b56-kube-api-access-s752g\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.518906 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.518871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9227d6c3-5736-4590-b1af-eaf1d30a0b56-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.519055 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.519026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/9227d6c3-5736-4590-b1af-eaf1d30a0b56-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.519116 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.519062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s752g\" (UniqueName: \"kubernetes.io/projected/9227d6c3-5736-4590-b1af-eaf1d30a0b56-kube-api-access-s752g\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.521374 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.521348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9227d6c3-5736-4590-b1af-eaf1d30a0b56-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.521517 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.521388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9227d6c3-5736-4590-b1af-eaf1d30a0b56-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.529439 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.529415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s752g\" (UniqueName: \"kubernetes.io/projected/9227d6c3-5736-4590-b1af-eaf1d30a0b56-kube-api-access-s752g\") pod \"opendatahub-operator-controller-manager-7875d57869-6x8vq\" (UID: \"9227d6c3-5736-4590-b1af-eaf1d30a0b56\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.617878 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.617824 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:21.736050 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.736027 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq"] Apr 20 19:28:21.738708 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:28:21.738670 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9227d6c3_5736_4590_b1af_eaf1d30a0b56.slice/crio-a076c82b38c6e0d00e5e2c69186648b9f1c53378cb21c6d128e2edd25cca44ae WatchSource:0}: Error finding container a076c82b38c6e0d00e5e2c69186648b9f1c53378cb21c6d128e2edd25cca44ae: Status 404 returned error can't find the container with id a076c82b38c6e0d00e5e2c69186648b9f1c53378cb21c6d128e2edd25cca44ae Apr 20 19:28:21.740288 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:21.740271 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:28:22.154240 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:22.154209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" event={"ID":"9227d6c3-5736-4590-b1af-eaf1d30a0b56","Type":"ContainerStarted","Data":"a076c82b38c6e0d00e5e2c69186648b9f1c53378cb21c6d128e2edd25cca44ae"} Apr 20 19:28:25.164557 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:25.164525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" event={"ID":"9227d6c3-5736-4590-b1af-eaf1d30a0b56","Type":"ContainerStarted","Data":"ffa63a2d56e0bf665c06329631bedc46b8a8e623361f05d917c926833b4c755e"} Apr 20 19:28:25.164931 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:25.164648 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:25.187307 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:25.187265 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" podStartSLOduration=1.646218409 podStartE2EDuration="4.187250402s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:21.74039282 +0000 UTC m=+466.454764886" lastFinishedPulling="2026-04-20 19:28:24.281424798 +0000 UTC m=+468.995796879" observedRunningTime="2026-04-20 19:28:25.18614508 +0000 UTC m=+469.900517168" watchObservedRunningTime="2026-04-20 19:28:25.187250402 +0000 UTC m=+469.901622542" Apr 20 19:28:33.240849 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.240814 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld"] Apr 20 19:28:33.244128 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.244110 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.246224 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.246199 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 19:28:33.246410 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.246382 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:28:33.247181 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.247154 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:28:33.247181 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.247168 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5slkl\"" Apr 20 19:28:33.247338 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.247226 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 19:28:33.247338 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.247315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 19:28:33.256961 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.256940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld"] Apr 20 19:28:33.301697 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.301673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4973e77a-7c03-43d9-8a24-133776787912-cert\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.301805 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.301710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpg2p\" (UniqueName: \"kubernetes.io/projected/4973e77a-7c03-43d9-8a24-133776787912-kube-api-access-rpg2p\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.301805 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.301766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4973e77a-7c03-43d9-8a24-133776787912-manager-config\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.301805 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.301799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4973e77a-7c03-43d9-8a24-133776787912-metrics-cert\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.402479 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.402451 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4973e77a-7c03-43d9-8a24-133776787912-cert\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.402593 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.402485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpg2p\" (UniqueName: \"kubernetes.io/projected/4973e77a-7c03-43d9-8a24-133776787912-kube-api-access-rpg2p\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.402593 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.402518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4973e77a-7c03-43d9-8a24-133776787912-manager-config\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.402593 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.402555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4973e77a-7c03-43d9-8a24-133776787912-metrics-cert\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.403191 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.403169 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4973e77a-7c03-43d9-8a24-133776787912-manager-config\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.405034 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.405006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4973e77a-7c03-43d9-8a24-133776787912-cert\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.405127 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.405070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4973e77a-7c03-43d9-8a24-133776787912-metrics-cert\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.411158 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.411133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpg2p\" (UniqueName: \"kubernetes.io/projected/4973e77a-7c03-43d9-8a24-133776787912-kube-api-access-rpg2p\") pod \"lws-controller-manager-5c6db948fd-2h2ld\" (UID: \"4973e77a-7c03-43d9-8a24-133776787912\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.553980 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.553913 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:33.685081 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:33.685052 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld"] Apr 20 19:28:33.685759 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:28:33.685709 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4973e77a_7c03_43d9_8a24_133776787912.slice/crio-af1101c3a0555dbc89dfc922ed891bec71de4177927d1078c5a8a62b5a8c1b81 WatchSource:0}: Error finding container af1101c3a0555dbc89dfc922ed891bec71de4177927d1078c5a8a62b5a8c1b81: Status 404 returned error can't find the container with id af1101c3a0555dbc89dfc922ed891bec71de4177927d1078c5a8a62b5a8c1b81 Apr 20 19:28:34.193074 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:34.193042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" event={"ID":"4973e77a-7c03-43d9-8a24-133776787912","Type":"ContainerStarted","Data":"af1101c3a0555dbc89dfc922ed891bec71de4177927d1078c5a8a62b5a8c1b81"} Apr 20 19:28:36.169275 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:36.169252 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-6x8vq" Apr 20 19:28:37.205073 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:37.205029 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" event={"ID":"4973e77a-7c03-43d9-8a24-133776787912","Type":"ContainerStarted","Data":"4c36ad5a6b366c7761dfea3cdd18d72207beeffbf00089a388b86c9ce67392f9"} Apr 20 19:28:37.205434 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:37.205090 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:28:37.229025 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:37.228981 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" podStartSLOduration=1.7784618650000001 podStartE2EDuration="4.228969082s" podCreationTimestamp="2026-04-20 19:28:33 +0000 UTC" firstStartedPulling="2026-04-20 19:28:33.68762176 +0000 UTC m=+478.401993826" lastFinishedPulling="2026-04-20 19:28:36.138128977 +0000 UTC m=+480.852501043" observedRunningTime="2026-04-20 19:28:37.228964527 +0000 UTC m=+481.943336615" watchObservedRunningTime="2026-04-20 19:28:37.228969082 +0000 UTC m=+481.943341169" Apr 20 19:28:39.607923 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.607891 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs"] Apr 20 19:28:39.610948 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.610925 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.613784 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.613764 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 19:28:39.613901 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.613844 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:28:39.613901 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.613795 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:28:39.613901 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.613795 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-s5rpw\"" Apr 20 19:28:39.613901 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.613803 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 19:28:39.620177 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.620157 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs"] Apr 20 19:28:39.755870 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.755842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b61770f-53f3-445c-833a-79a6399688f0-tmp\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.756011 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.755881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b61770f-53f3-445c-833a-79a6399688f0-tls-certs\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.756011 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.755899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfv7h\" (UniqueName: \"kubernetes.io/projected/4b61770f-53f3-445c-833a-79a6399688f0-kube-api-access-sfv7h\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.856227 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.856203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b61770f-53f3-445c-833a-79a6399688f0-tmp\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.856370 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.856240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b61770f-53f3-445c-833a-79a6399688f0-tls-certs\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.856370 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.856258 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfv7h\" (UniqueName: \"kubernetes.io/projected/4b61770f-53f3-445c-833a-79a6399688f0-kube-api-access-sfv7h\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.858407 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.858346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b61770f-53f3-445c-833a-79a6399688f0-tmp\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.858573 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.858556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b61770f-53f3-445c-833a-79a6399688f0-tls-certs\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.863720 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.863698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfv7h\" (UniqueName: \"kubernetes.io/projected/4b61770f-53f3-445c-833a-79a6399688f0-kube-api-access-sfv7h\") pod \"kube-auth-proxy-65b68d668c-pdqzs\" (UID: \"4b61770f-53f3-445c-833a-79a6399688f0\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:39.920602 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:39.920582 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" Apr 20 19:28:40.035379 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:40.035357 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs"] Apr 20 19:28:40.037888 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:28:40.037860 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b61770f_53f3_445c_833a_79a6399688f0.slice/crio-b9e1bbf28b4074aa05b509a6587c2b3617fe6ce6c4d420abd17f4852ecb9163f WatchSource:0}: Error finding container b9e1bbf28b4074aa05b509a6587c2b3617fe6ce6c4d420abd17f4852ecb9163f: Status 404 returned error can't find the container with id b9e1bbf28b4074aa05b509a6587c2b3617fe6ce6c4d420abd17f4852ecb9163f Apr 20 19:28:40.215365 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:40.215296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" event={"ID":"4b61770f-53f3-445c-833a-79a6399688f0","Type":"ContainerStarted","Data":"b9e1bbf28b4074aa05b509a6587c2b3617fe6ce6c4d420abd17f4852ecb9163f"} Apr 20 19:28:44.232141 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:44.232109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" event={"ID":"4b61770f-53f3-445c-833a-79a6399688f0","Type":"ContainerStarted","Data":"5965a257d53b63671251b26a87ad7cf24ecf455dd0d3a9d01f29b8624e2a9442"} Apr 20 19:28:44.246844 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:44.246800 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-65b68d668c-pdqzs" podStartSLOduration=1.772910881 podStartE2EDuration="5.246785778s" podCreationTimestamp="2026-04-20 
19:28:39 +0000 UTC" firstStartedPulling="2026-04-20 19:28:40.039809942 +0000 UTC m=+484.754182009" lastFinishedPulling="2026-04-20 19:28:43.513684827 +0000 UTC m=+488.228056906" observedRunningTime="2026-04-20 19:28:44.245561096 +0000 UTC m=+488.959933184" watchObservedRunningTime="2026-04-20 19:28:44.246785778 +0000 UTC m=+488.961157870" Apr 20 19:28:48.210964 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:28:48.210936 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-2h2ld" Apr 20 19:30:27.887298 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.887266 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm"] Apr 20 19:30:27.890291 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.890274 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:27.892361 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.892332 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 19:30:27.892474 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.892369 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:30:27.892474 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.892376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-j7x8z\"" Apr 20 19:30:27.893164 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.893141 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 19:30:27.893164 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.893156 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:30:27.900187 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:27.900170 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm"] Apr 20 19:30:28.054001 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.053973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/17ec7f02-9904-49e3-9f2e-554d1406846b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.054001 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.054003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bwg\" (UniqueName: \"kubernetes.io/projected/17ec7f02-9904-49e3-9f2e-554d1406846b-kube-api-access-c2bwg\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.054228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.054039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ec7f02-9904-49e3-9f2e-554d1406846b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.155341 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.155269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ec7f02-9904-49e3-9f2e-554d1406846b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.155341 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.155323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/17ec7f02-9904-49e3-9f2e-554d1406846b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.155527 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.155350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bwg\" (UniqueName: \"kubernetes.io/projected/17ec7f02-9904-49e3-9f2e-554d1406846b-kube-api-access-c2bwg\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.156016 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.155996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/17ec7f02-9904-49e3-9f2e-554d1406846b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.157547 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.157529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/17ec7f02-9904-49e3-9f2e-554d1406846b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.163383 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.163357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bwg\" (UniqueName: \"kubernetes.io/projected/17ec7f02-9904-49e3-9f2e-554d1406846b-kube-api-access-c2bwg\") pod \"kuadrant-console-plugin-6cb54b5c86-92jsm\" (UID: \"17ec7f02-9904-49e3-9f2e-554d1406846b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.210859 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.210837 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" Apr 20 19:30:28.324707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.324622 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm"] Apr 20 19:30:28.327044 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:30:28.327017 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17ec7f02_9904_49e3_9f2e_554d1406846b.slice/crio-b6bda095eabfb91f86f96e832e91a2a15b77b3deebdaea75350309020ecbfe7b WatchSource:0}: Error finding container b6bda095eabfb91f86f96e832e91a2a15b77b3deebdaea75350309020ecbfe7b: Status 404 returned error can't find the container with id b6bda095eabfb91f86f96e832e91a2a15b77b3deebdaea75350309020ecbfe7b Apr 20 19:30:28.551320 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:28.551289 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" event={"ID":"17ec7f02-9904-49e3-9f2e-554d1406846b","Type":"ContainerStarted","Data":"b6bda095eabfb91f86f96e832e91a2a15b77b3deebdaea75350309020ecbfe7b"} Apr 20 19:30:35.776814 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:35.776782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log" Apr 20 19:30:35.778265 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:35.778243 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log" Apr 20 19:30:52.635931 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:52.635898 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" event={"ID":"17ec7f02-9904-49e3-9f2e-554d1406846b","Type":"ContainerStarted","Data":"07f4f22dfcb6f2ab4b3c812f48531c6c70c07ece70d4e36aa6d7b6f8e8d0c65e"} Apr 20 19:30:52.669204 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:30:52.669163 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-92jsm" podStartSLOduration=1.5150438 podStartE2EDuration="25.669149911s" podCreationTimestamp="2026-04-20 19:30:27 +0000 UTC" firstStartedPulling="2026-04-20 19:30:28.328835112 +0000 UTC m=+593.043207179" lastFinishedPulling="2026-04-20 19:30:52.482941224 +0000 UTC m=+617.197313290" observedRunningTime="2026-04-20 19:30:52.667817359 +0000 UTC m=+617.382189446" watchObservedRunningTime="2026-04-20 19:30:52.669149911 +0000 UTC m=+617.383521998" Apr 20 19:31:19.875935 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:19.875898 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kjknv"] Apr 20 19:31:20.123515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.123483 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kjknv"] Apr 20 19:31:20.123515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.123514 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kjknv"] Apr 20 19:31:20.123709 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.123620 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.125846 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.125823 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 19:31:20.265112 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.265080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/711b8f4a-7068-4513-93d2-275b7aff3ddf-config-file\") pod \"limitador-limitador-7d549b5b-kjknv\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.265112 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.265114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqtz6\" (UniqueName: \"kubernetes.io/projected/711b8f4a-7068-4513-93d2-275b7aff3ddf-kube-api-access-nqtz6\") pod \"limitador-limitador-7d549b5b-kjknv\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.365514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.365481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/711b8f4a-7068-4513-93d2-275b7aff3ddf-config-file\") pod \"limitador-limitador-7d549b5b-kjknv\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.365514 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.365515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqtz6\" (UniqueName: \"kubernetes.io/projected/711b8f4a-7068-4513-93d2-275b7aff3ddf-kube-api-access-nqtz6\") pod \"limitador-limitador-7d549b5b-kjknv\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.366089 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.366070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/711b8f4a-7068-4513-93d2-275b7aff3ddf-config-file\") pod \"limitador-limitador-7d549b5b-kjknv\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.373256 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.373231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqtz6\" (UniqueName: \"kubernetes.io/projected/711b8f4a-7068-4513-93d2-275b7aff3ddf-kube-api-access-nqtz6\") pod \"limitador-limitador-7d549b5b-kjknv\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.434087 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.434030 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:20.546957 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.546930 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kjknv"] Apr 20 19:31:20.549254 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:31:20.549218 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711b8f4a_7068_4513_93d2_275b7aff3ddf.slice/crio-a8b7e241b482028438c21248d188321ba3ebefc68679e55212f310c4b4289f82 WatchSource:0}: Error finding container a8b7e241b482028438c21248d188321ba3ebefc68679e55212f310c4b4289f82: Status 404 returned error can't find the container with id a8b7e241b482028438c21248d188321ba3ebefc68679e55212f310c4b4289f82 Apr 20 19:31:20.726566 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:20.726495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" event={"ID":"711b8f4a-7068-4513-93d2-275b7aff3ddf","Type":"ContainerStarted","Data":"a8b7e241b482028438c21248d188321ba3ebefc68679e55212f310c4b4289f82"} Apr 20 19:31:24.740616 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:24.740577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" event={"ID":"711b8f4a-7068-4513-93d2-275b7aff3ddf","Type":"ContainerStarted","Data":"e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8"} Apr 20 19:31:24.741092 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:24.740675 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:24.757073 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:24.757029 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" podStartSLOduration=2.159351311 podStartE2EDuration="5.757017397s" podCreationTimestamp="2026-04-20 19:31:19 +0000 UTC" firstStartedPulling="2026-04-20 19:31:20.551035099 +0000 UTC m=+645.265407164" lastFinishedPulling="2026-04-20 19:31:24.148701184 +0000 UTC m=+648.863073250" observedRunningTime="2026-04-20 19:31:24.755523078 +0000 UTC m=+649.469895159" watchObservedRunningTime="2026-04-20 19:31:24.757017397 +0000 UTC m=+649.471389485" Apr 20 19:31:35.744975 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:35.744943 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:37.018564 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.018532 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kjknv"] Apr 20 19:31:37.018951 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.018749 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" podUID="711b8f4a-7068-4513-93d2-275b7aff3ddf" containerName="limitador" containerID="cri-o://e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8" gracePeriod=30 Apr 20 19:31:37.550808 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.550785 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:37.697295 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.697209 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/711b8f4a-7068-4513-93d2-275b7aff3ddf-config-file\") pod \"711b8f4a-7068-4513-93d2-275b7aff3ddf\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " Apr 20 19:31:37.697295 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.697258 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqtz6\" (UniqueName: \"kubernetes.io/projected/711b8f4a-7068-4513-93d2-275b7aff3ddf-kube-api-access-nqtz6\") pod \"711b8f4a-7068-4513-93d2-275b7aff3ddf\" (UID: \"711b8f4a-7068-4513-93d2-275b7aff3ddf\") " Apr 20 19:31:37.697555 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.697530 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711b8f4a-7068-4513-93d2-275b7aff3ddf-config-file" (OuterVolumeSpecName: "config-file") pod "711b8f4a-7068-4513-93d2-275b7aff3ddf" (UID: "711b8f4a-7068-4513-93d2-275b7aff3ddf"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:31:37.699489 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.699453 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711b8f4a-7068-4513-93d2-275b7aff3ddf-kube-api-access-nqtz6" (OuterVolumeSpecName: "kube-api-access-nqtz6") pod "711b8f4a-7068-4513-93d2-275b7aff3ddf" (UID: "711b8f4a-7068-4513-93d2-275b7aff3ddf"). InnerVolumeSpecName "kube-api-access-nqtz6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:31:37.781400 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.781374 2577 generic.go:358] "Generic (PLEG): container finished" podID="711b8f4a-7068-4513-93d2-275b7aff3ddf" containerID="e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8" exitCode=0 Apr 20 19:31:37.781518 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.781435 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" Apr 20 19:31:37.781518 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.781462 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" event={"ID":"711b8f4a-7068-4513-93d2-275b7aff3ddf","Type":"ContainerDied","Data":"e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8"} Apr 20 19:31:37.781518 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.781505 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-kjknv" event={"ID":"711b8f4a-7068-4513-93d2-275b7aff3ddf","Type":"ContainerDied","Data":"a8b7e241b482028438c21248d188321ba3ebefc68679e55212f310c4b4289f82"} Apr 20 19:31:37.781685 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.781526 2577 scope.go:117] "RemoveContainer" containerID="e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8" Apr 20 19:31:37.789435 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.789418 2577 scope.go:117] "RemoveContainer" containerID="e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8" Apr 20 19:31:37.789677 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:31:37.789660 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8\": container with ID starting with e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8 not found: ID does not exist" containerID="e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8" Apr 20 19:31:37.789763 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.789684 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8"} err="failed to get container status \"e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8\": rpc error: code = NotFound desc = could not find container \"e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8\": container with ID starting with e86ba660bdfd6e97b237f1b09d1faffe521c676c1ade402408582476d6862ea8 not found: ID does not exist" Apr 20 19:31:37.798384 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.798368 2577 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/711b8f4a-7068-4513-93d2-275b7aff3ddf-config-file\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:31:37.798434 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.798387 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nqtz6\" (UniqueName: \"kubernetes.io/projected/711b8f4a-7068-4513-93d2-275b7aff3ddf-kube-api-access-nqtz6\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:31:37.801536 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.801512 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kjknv"] Apr 20 19:31:37.804821 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.804801 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-kjknv"] Apr 20 19:31:37.834646 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:37.834623 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711b8f4a-7068-4513-93d2-275b7aff3ddf" path="/var/lib/kubelet/pods/711b8f4a-7068-4513-93d2-275b7aff3ddf/volumes" Apr 
Apr 20 19:31:58.777546 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.777515 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-z2mc9"]
Apr 20 19:31:58.778065 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.777971 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="711b8f4a-7068-4513-93d2-275b7aff3ddf" containerName="limitador"
Apr 20 19:31:58.778065 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.777990 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="711b8f4a-7068-4513-93d2-275b7aff3ddf" containerName="limitador"
Apr 20 19:31:58.778182 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.778089 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="711b8f4a-7068-4513-93d2-275b7aff3ddf" containerName="limitador"
Apr 20 19:31:58.784791 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.784769 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9"
Apr 20 19:31:58.787431 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.787409 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-z2mc9"]
Apr 20 19:31:58.787554 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.787435 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-9lgw8\""
Apr 20 19:31:58.851033 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.851005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5h2t\" (UniqueName: \"kubernetes.io/projected/bde41f01-4e8c-4e20-ba99-c1d9613c2a45-kube-api-access-v5h2t\") pod \"maas-controller-6d4c8f55f9-z2mc9\" (UID: \"bde41f01-4e8c-4e20-ba99-c1d9613c2a45\") " pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9"
Apr 20 19:31:58.929583 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.929551 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f4c846855-hl4mn"]
Apr 20 19:31:58.932847 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.932831 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f4c846855-hl4mn"
Need to start a new one" pod="opendatahub/maas-controller-f4c846855-hl4mn" Apr 20 19:31:58.941468 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.941445 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f4c846855-hl4mn"] Apr 20 19:31:58.951700 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.951675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ppbj\" (UniqueName: \"kubernetes.io/projected/6b212403-0fa0-4d03-905c-f4b9a1231c5f-kube-api-access-4ppbj\") pod \"maas-controller-f4c846855-hl4mn\" (UID: \"6b212403-0fa0-4d03-905c-f4b9a1231c5f\") " pod="opendatahub/maas-controller-f4c846855-hl4mn" Apr 20 19:31:58.951833 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.951738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h2t\" (UniqueName: \"kubernetes.io/projected/bde41f01-4e8c-4e20-ba99-c1d9613c2a45-kube-api-access-v5h2t\") pod \"maas-controller-6d4c8f55f9-z2mc9\" (UID: \"bde41f01-4e8c-4e20-ba99-c1d9613c2a45\") " pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" Apr 20 19:31:58.959825 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:58.959808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5h2t\" (UniqueName: \"kubernetes.io/projected/bde41f01-4e8c-4e20-ba99-c1d9613c2a45-kube-api-access-v5h2t\") pod \"maas-controller-6d4c8f55f9-z2mc9\" (UID: \"bde41f01-4e8c-4e20-ba99-c1d9613c2a45\") " pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" Apr 20 19:31:59.052387 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.052333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ppbj\" (UniqueName: \"kubernetes.io/projected/6b212403-0fa0-4d03-905c-f4b9a1231c5f-kube-api-access-4ppbj\") pod \"maas-controller-f4c846855-hl4mn\" (UID: \"6b212403-0fa0-4d03-905c-f4b9a1231c5f\") " pod="opendatahub/maas-controller-f4c846855-hl4mn" Apr 20 19:31:59.061925 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.061904 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-z2mc9"] Apr 20 19:31:59.062132 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.062118 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" Apr 20 19:31:59.063919 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.063893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ppbj\" (UniqueName: \"kubernetes.io/projected/6b212403-0fa0-4d03-905c-f4b9a1231c5f-kube-api-access-4ppbj\") pod \"maas-controller-f4c846855-hl4mn\" (UID: \"6b212403-0fa0-4d03-905c-f4b9a1231c5f\") " pod="opendatahub/maas-controller-f4c846855-hl4mn" Apr 20 19:31:59.090715 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.090694 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-84f6879458-488b8"] Apr 20 19:31:59.095386 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.095371 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:31:59.102429 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.102381 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84f6879458-488b8"] Apr 20 19:31:59.153109 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.153080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k2z8\" (UniqueName: \"kubernetes.io/projected/8971c4e4-9626-423a-af58-ccd2453fc726-kube-api-access-5k2z8\") pod \"maas-controller-84f6879458-488b8\" (UID: \"8971c4e4-9626-423a-af58-ccd2453fc726\") " pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:31:59.185966 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.185940 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-z2mc9"] Apr 20 19:31:59.187931 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:31:59.187903 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde41f01_4e8c_4e20_ba99_c1d9613c2a45.slice/crio-500dd9d02c3837a3e509f92de3a3e9d3d0a0bf83579cb4c100469828e2665b13 WatchSource:0}: Error finding container 500dd9d02c3837a3e509f92de3a3e9d3d0a0bf83579cb4c100469828e2665b13: Status 404 returned error can't find the container with id 500dd9d02c3837a3e509f92de3a3e9d3d0a0bf83579cb4c100469828e2665b13 Apr 20 19:31:59.243626 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.243603 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f4c846855-hl4mn" Apr 20 19:31:59.254519 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.254490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k2z8\" (UniqueName: \"kubernetes.io/projected/8971c4e4-9626-423a-af58-ccd2453fc726-kube-api-access-5k2z8\") pod \"maas-controller-84f6879458-488b8\" (UID: \"8971c4e4-9626-423a-af58-ccd2453fc726\") " pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:31:59.263082 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.263062 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k2z8\" (UniqueName: \"kubernetes.io/projected/8971c4e4-9626-423a-af58-ccd2453fc726-kube-api-access-5k2z8\") pod \"maas-controller-84f6879458-488b8\" (UID: \"8971c4e4-9626-423a-af58-ccd2453fc726\") " pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:31:59.359670 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.359641 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f4c846855-hl4mn"] Apr 20 19:31:59.362562 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:31:59.362527 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b212403_0fa0_4d03_905c_f4b9a1231c5f.slice/crio-aad7720837716de9a0d74dd101ca2c2dad85723596a254a768a0bc681ddf7b9c WatchSource:0}: Error finding container aad7720837716de9a0d74dd101ca2c2dad85723596a254a768a0bc681ddf7b9c: Status 404 returned error can't find the container with id aad7720837716de9a0d74dd101ca2c2dad85723596a254a768a0bc681ddf7b9c Apr 20 19:31:59.407582 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.407546 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:31:59.519715 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.519691 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-84f6879458-488b8"] Apr 20 19:31:59.521655 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:31:59.521627 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8971c4e4_9626_423a_af58_ccd2453fc726.slice/crio-521e9c3ea3065b62876bc6a81e0e73a4dded35b162d703c79705bb20e8e3faaa WatchSource:0}: Error finding container 521e9c3ea3065b62876bc6a81e0e73a4dded35b162d703c79705bb20e8e3faaa: Status 404 returned error can't find the container with id 521e9c3ea3065b62876bc6a81e0e73a4dded35b162d703c79705bb20e8e3faaa Apr 20 19:31:59.855596 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.855553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f4c846855-hl4mn" event={"ID":"6b212403-0fa0-4d03-905c-f4b9a1231c5f","Type":"ContainerStarted","Data":"aad7720837716de9a0d74dd101ca2c2dad85723596a254a768a0bc681ddf7b9c"} Apr 20 19:31:59.856946 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.856906 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84f6879458-488b8" event={"ID":"8971c4e4-9626-423a-af58-ccd2453fc726","Type":"ContainerStarted","Data":"521e9c3ea3065b62876bc6a81e0e73a4dded35b162d703c79705bb20e8e3faaa"} Apr 20 19:31:59.858549 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:31:59.858519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" event={"ID":"bde41f01-4e8c-4e20-ba99-c1d9613c2a45","Type":"ContainerStarted","Data":"500dd9d02c3837a3e509f92de3a3e9d3d0a0bf83579cb4c100469828e2665b13"} Apr 20 19:32:02.872678 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.872639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84f6879458-488b8" event={"ID":"8971c4e4-9626-423a-af58-ccd2453fc726","Type":"ContainerStarted","Data":"d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484"} Apr 20 19:32:02.873126 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.872760 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:32:02.874128 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.874066 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" podUID="bde41f01-4e8c-4e20-ba99-c1d9613c2a45" containerName="manager" containerID="cri-o://45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b" gracePeriod=10 Apr 20 19:32:02.874268 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.874071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" event={"ID":"bde41f01-4e8c-4e20-ba99-c1d9613c2a45","Type":"ContainerStarted","Data":"45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b"} Apr 20 19:32:02.874268 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.874168 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" Apr 20 19:32:02.875484 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.875457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f4c846855-hl4mn" 
event={"ID":"6b212403-0fa0-4d03-905c-f4b9a1231c5f","Type":"ContainerStarted","Data":"3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37"} Apr 20 19:32:02.875598 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.875583 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f4c846855-hl4mn" Apr 20 19:32:02.889567 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.889528 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-84f6879458-488b8" podStartSLOduration=0.998078028 podStartE2EDuration="3.889517029s" podCreationTimestamp="2026-04-20 19:31:59 +0000 UTC" firstStartedPulling="2026-04-20 19:31:59.522943469 +0000 UTC m=+684.237315535" lastFinishedPulling="2026-04-20 19:32:02.41438247 +0000 UTC m=+687.128754536" observedRunningTime="2026-04-20 19:32:02.888208743 +0000 UTC m=+687.602580832" watchObservedRunningTime="2026-04-20 19:32:02.889517029 +0000 UTC m=+687.603889095" Apr 20 19:32:02.909234 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.909181 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" podStartSLOduration=1.683608815 podStartE2EDuration="4.909169093s" podCreationTimestamp="2026-04-20 19:31:58 +0000 UTC" firstStartedPulling="2026-04-20 19:31:59.189134329 +0000 UTC m=+683.903506396" lastFinishedPulling="2026-04-20 19:32:02.414694609 +0000 UTC m=+687.129066674" observedRunningTime="2026-04-20 19:32:02.907594562 +0000 UTC m=+687.621966651" watchObservedRunningTime="2026-04-20 19:32:02.909169093 +0000 UTC m=+687.623541180" Apr 20 19:32:02.925262 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:02.925216 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f4c846855-hl4mn" podStartSLOduration=1.867976615 podStartE2EDuration="4.92520182s" podCreationTimestamp="2026-04-20 19:31:58 +0000 UTC" firstStartedPulling="2026-04-20 19:31:59.363923578 +0000 UTC m=+684.078295648" lastFinishedPulling="2026-04-20 19:32:02.421148772 +0000 UTC m=+687.135520853" observedRunningTime="2026-04-20 19:32:02.923496357 +0000 UTC m=+687.637868447" watchObservedRunningTime="2026-04-20 19:32:02.92520182 +0000 UTC m=+687.639573908" Apr 20 19:32:03.108856 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.108834 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" Apr 20 19:32:03.190062 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.189982 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5h2t\" (UniqueName: \"kubernetes.io/projected/bde41f01-4e8c-4e20-ba99-c1d9613c2a45-kube-api-access-v5h2t\") pod \"bde41f01-4e8c-4e20-ba99-c1d9613c2a45\" (UID: \"bde41f01-4e8c-4e20-ba99-c1d9613c2a45\") " Apr 20 19:32:03.192086 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.192059 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde41f01-4e8c-4e20-ba99-c1d9613c2a45-kube-api-access-v5h2t" (OuterVolumeSpecName: "kube-api-access-v5h2t") pod "bde41f01-4e8c-4e20-ba99-c1d9613c2a45" (UID: "bde41f01-4e8c-4e20-ba99-c1d9613c2a45"). InnerVolumeSpecName "kube-api-access-v5h2t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:32:03.290756 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.290711 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5h2t\" (UniqueName: \"kubernetes.io/projected/bde41f01-4e8c-4e20-ba99-c1d9613c2a45-kube-api-access-v5h2t\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:32:03.879365 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.879335 2577 generic.go:358] "Generic (PLEG): container finished" podID="bde41f01-4e8c-4e20-ba99-c1d9613c2a45" containerID="45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b" exitCode=0 Apr 20 19:32:03.879711 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.879381 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" Apr 20 19:32:03.879711 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.879407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" event={"ID":"bde41f01-4e8c-4e20-ba99-c1d9613c2a45","Type":"ContainerDied","Data":"45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b"} Apr 20 19:32:03.879711 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.879438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-z2mc9" event={"ID":"bde41f01-4e8c-4e20-ba99-c1d9613c2a45","Type":"ContainerDied","Data":"500dd9d02c3837a3e509f92de3a3e9d3d0a0bf83579cb4c100469828e2665b13"} Apr 20 19:32:03.879711 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.879458 2577 scope.go:117] "RemoveContainer" containerID="45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b" Apr 20 19:32:03.886960 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.886941 2577 scope.go:117] "RemoveContainer" containerID="45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b" Apr 20 19:32:03.887220 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:32:03.887204 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b\": container with ID starting with 45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b not found: ID does not exist" containerID="45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b" Apr 20 19:32:03.887264 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.887229 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b"} err="failed to get container status \"45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b\": rpc error: code = NotFound desc = could not find container \"45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b\": container with ID starting with 45e600ad28dbdb3acde2560acc0b7741e5011194d05661b20146829f6995e26b not found: ID does not exist" Apr 20 19:32:03.895175 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.895156 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-z2mc9"] Apr 20 19:32:03.899162 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:03.899142 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-z2mc9"] Apr 20 19:32:04.777972 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.777944 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/maas-api-79c5c4c5cf-vgfcx"] Apr 20 19:32:04.778303 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.778289 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bde41f01-4e8c-4e20-ba99-c1d9613c2a45" containerName="manager" Apr 20 19:32:04.778369 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.778305 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde41f01-4e8c-4e20-ba99-c1d9613c2a45" containerName="manager" Apr 20 19:32:04.778406 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.778378 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bde41f01-4e8c-4e20-ba99-c1d9613c2a45" containerName="manager" Apr 20 19:32:04.782221 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.782203 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:04.785314 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.785294 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-ssfr9\"" Apr 20 19:32:04.785413 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.785311 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 19:32:04.785413 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.785292 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 19:32:04.791873 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.791849 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79c5c4c5cf-vgfcx"] Apr 20 19:32:04.905620 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.905592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a64ca67-0dde-43ce-99b7-068526223bc8-maas-api-tls\") pod \"maas-api-79c5c4c5cf-vgfcx\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:04.906031 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:04.905795 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd7p8\" (UniqueName: \"kubernetes.io/projected/7a64ca67-0dde-43ce-99b7-068526223bc8-kube-api-access-xd7p8\") pod \"maas-api-79c5c4c5cf-vgfcx\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:05.006312 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.006288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd7p8\" (UniqueName: \"kubernetes.io/projected/7a64ca67-0dde-43ce-99b7-068526223bc8-kube-api-access-xd7p8\") pod \"maas-api-79c5c4c5cf-vgfcx\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:05.006449 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.006329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a64ca67-0dde-43ce-99b7-068526223bc8-maas-api-tls\") pod \"maas-api-79c5c4c5cf-vgfcx\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:05.008499 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.008470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: 
\"kubernetes.io/secret/7a64ca67-0dde-43ce-99b7-068526223bc8-maas-api-tls\") pod \"maas-api-79c5c4c5cf-vgfcx\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:05.014150 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.014126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd7p8\" (UniqueName: \"kubernetes.io/projected/7a64ca67-0dde-43ce-99b7-068526223bc8-kube-api-access-xd7p8\") pod \"maas-api-79c5c4c5cf-vgfcx\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:05.099795 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.099742 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:05.219011 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.218989 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79c5c4c5cf-vgfcx"] Apr 20 19:32:05.220880 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:32:05.220852 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a64ca67_0dde_43ce_99b7_068526223bc8.slice/crio-87e27ff27517cc08a3a6adf2eb092a49d2f501bb4e2c27c14a7d60230ffffc5b WatchSource:0}: Error finding container 87e27ff27517cc08a3a6adf2eb092a49d2f501bb4e2c27c14a7d60230ffffc5b: Status 404 returned error can't find the container with id 87e27ff27517cc08a3a6adf2eb092a49d2f501bb4e2c27c14a7d60230ffffc5b Apr 20 19:32:05.835800 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.835758 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde41f01-4e8c-4e20-ba99-c1d9613c2a45" path="/var/lib/kubelet/pods/bde41f01-4e8c-4e20-ba99-c1d9613c2a45/volumes" Apr 20 19:32:05.888491 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:05.888442 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" event={"ID":"7a64ca67-0dde-43ce-99b7-068526223bc8","Type":"ContainerStarted","Data":"87e27ff27517cc08a3a6adf2eb092a49d2f501bb4e2c27c14a7d60230ffffc5b"} Apr 20 19:32:06.893432 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:06.893398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" event={"ID":"7a64ca67-0dde-43ce-99b7-068526223bc8","Type":"ContainerStarted","Data":"8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe"} Apr 20 19:32:06.893838 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:06.893546 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:06.909055 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:06.909007 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" podStartSLOduration=1.8440878509999998 podStartE2EDuration="2.908993013s" podCreationTimestamp="2026-04-20 19:32:04 +0000 UTC" firstStartedPulling="2026-04-20 19:32:05.222624197 +0000 UTC m=+689.936996263" lastFinishedPulling="2026-04-20 19:32:06.287529356 +0000 UTC m=+691.001901425" observedRunningTime="2026-04-20 19:32:06.908132829 +0000 UTC m=+691.622504917" watchObservedRunningTime="2026-04-20 19:32:06.908993013 +0000 UTC m=+691.623365101" Apr 20 19:32:12.902280 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:12.902252 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 
Apr 20 19:32:12.918024 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:12.917990 2577 generic.go:358] "Generic (PLEG): container finished" podID="8971c4e4-9626-423a-af58-ccd2453fc726" containerID="d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484" exitCode=1
Apr 20 19:32:12.918213 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:12.918052 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84f6879458-488b8" event={"ID":"8971c4e4-9626-423a-af58-ccd2453fc726","Type":"ContainerDied","Data":"d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484"}
Apr 20 19:32:12.918391 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:12.918377 2577 scope.go:117] "RemoveContainer" containerID="d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484"
Apr 20 19:32:13.885325 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:13.885298 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f4c846855-hl4mn"
Apr 20 19:32:13.922607 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:13.922575 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84f6879458-488b8" event={"ID":"8971c4e4-9626-423a-af58-ccd2453fc726","Type":"ContainerStarted","Data":"cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06"}
Apr 20 19:32:13.923010 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:13.922763 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-84f6879458-488b8"
Apr 20 19:32:24.930923 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:24.930889 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-84f6879458-488b8"
Apr 20 19:32:24.971592 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:24.971564 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f4c846855-hl4mn"]
Apr 20 19:32:24.971804 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:24.971782 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-f4c846855-hl4mn" podUID="6b212403-0fa0-4d03-905c-f4b9a1231c5f" containerName="manager" containerID="cri-o://3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37" gracePeriod=10
Apr 20 19:32:25.205799 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.205776 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f4c846855-hl4mn"
Apr 20 19:32:25.265557 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.265534 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ppbj\" (UniqueName: \"kubernetes.io/projected/6b212403-0fa0-4d03-905c-f4b9a1231c5f-kube-api-access-4ppbj\") pod \"6b212403-0fa0-4d03-905c-f4b9a1231c5f\" (UID: \"6b212403-0fa0-4d03-905c-f4b9a1231c5f\") "
Apr 20 19:32:25.267580 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.267555 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b212403-0fa0-4d03-905c-f4b9a1231c5f-kube-api-access-4ppbj" (OuterVolumeSpecName: "kube-api-access-4ppbj") pod "6b212403-0fa0-4d03-905c-f4b9a1231c5f" (UID: "6b212403-0fa0-4d03-905c-f4b9a1231c5f"). InnerVolumeSpecName "kube-api-access-4ppbj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:32:25.311862 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.311839 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f579989cf-nzc6t"] Apr 20 19:32:25.312157 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.312145 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b212403-0fa0-4d03-905c-f4b9a1231c5f" containerName="manager" Apr 20 19:32:25.312200 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.312159 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b212403-0fa0-4d03-905c-f4b9a1231c5f" containerName="manager" Apr 20 19:32:25.312237 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.312219 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b212403-0fa0-4d03-905c-f4b9a1231c5f" containerName="manager" Apr 20 19:32:25.315195 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.315179 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:32:25.320696 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.320674 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f579989cf-nzc6t"] Apr 20 19:32:25.366517 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.366496 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4ppbj\" (UniqueName: \"kubernetes.io/projected/6b212403-0fa0-4d03-905c-f4b9a1231c5f-kube-api-access-4ppbj\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:32:25.467871 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.467802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkqm2\" (UniqueName: \"kubernetes.io/projected/34efbb5a-b469-45ea-a731-c36736884ce7-kube-api-access-hkqm2\") pod \"maas-controller-f579989cf-nzc6t\" (UID: \"34efbb5a-b469-45ea-a731-c36736884ce7\") " pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:32:25.568533 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.568507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkqm2\" (UniqueName: \"kubernetes.io/projected/34efbb5a-b469-45ea-a731-c36736884ce7-kube-api-access-hkqm2\") pod \"maas-controller-f579989cf-nzc6t\" (UID: \"34efbb5a-b469-45ea-a731-c36736884ce7\") " pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:32:25.576538 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.576512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkqm2\" (UniqueName: \"kubernetes.io/projected/34efbb5a-b469-45ea-a731-c36736884ce7-kube-api-access-hkqm2\") pod \"maas-controller-f579989cf-nzc6t\" (UID: \"34efbb5a-b469-45ea-a731-c36736884ce7\") " pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:32:25.626050 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.626026 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:32:25.739955 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.739929 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f579989cf-nzc6t"] Apr 20 19:32:25.742264 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:32:25.742237 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34efbb5a_b469_45ea_a731_c36736884ce7.slice/crio-acd5e2fd5aa00b9aaedf38b49d01fc769e89494376b39ad7c1b9d87275a95ff8 WatchSource:0}: Error finding container acd5e2fd5aa00b9aaedf38b49d01fc769e89494376b39ad7c1b9d87275a95ff8: Status 404 returned error can't find the container with id acd5e2fd5aa00b9aaedf38b49d01fc769e89494376b39ad7c1b9d87275a95ff8 Apr 20 19:32:25.962051 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.962025 2577 generic.go:358] "Generic (PLEG): container finished" podID="6b212403-0fa0-4d03-905c-f4b9a1231c5f" containerID="3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37" exitCode=0 Apr 20 19:32:25.962403 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.962088 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f4c846855-hl4mn" Apr 20 19:32:25.962403 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.962107 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f4c846855-hl4mn" event={"ID":"6b212403-0fa0-4d03-905c-f4b9a1231c5f","Type":"ContainerDied","Data":"3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37"} Apr 20 19:32:25.962403 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.962151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f4c846855-hl4mn" event={"ID":"6b212403-0fa0-4d03-905c-f4b9a1231c5f","Type":"ContainerDied","Data":"aad7720837716de9a0d74dd101ca2c2dad85723596a254a768a0bc681ddf7b9c"} Apr 20 19:32:25.962403 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.962173 2577 scope.go:117] "RemoveContainer" containerID="3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37" Apr 20 19:32:25.963185 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.963163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f579989cf-nzc6t" event={"ID":"34efbb5a-b469-45ea-a731-c36736884ce7","Type":"ContainerStarted","Data":"acd5e2fd5aa00b9aaedf38b49d01fc769e89494376b39ad7c1b9d87275a95ff8"} Apr 20 19:32:25.971685 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.971237 2577 scope.go:117] "RemoveContainer" containerID="3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37" Apr 20 19:32:25.971863 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:32:25.971843 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37\": container with ID starting with 3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37 not found: ID does not exist" containerID="3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37" Apr 20 19:32:25.971920 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.971876 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37"} err="failed to get container status 
\"3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37\": rpc error: code = NotFound desc = could not find container \"3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37\": container with ID starting with 3a10d91c8d36352b8ec7590120dbf8dea0c878e5c3843bcf333909c9adf86d37 not found: ID does not exist" Apr 20 19:32:25.978576 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.978556 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f4c846855-hl4mn"] Apr 20 19:32:25.982227 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:25.982207 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f4c846855-hl4mn"] Apr 20 19:32:26.967705 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:26.967666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f579989cf-nzc6t" event={"ID":"34efbb5a-b469-45ea-a731-c36736884ce7","Type":"ContainerStarted","Data":"e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868"} Apr 20 19:32:26.968145 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:26.967774 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:32:26.984767 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:26.984708 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f579989cf-nzc6t" podStartSLOduration=1.688411452 podStartE2EDuration="1.984696762s" podCreationTimestamp="2026-04-20 19:32:25 +0000 UTC" firstStartedPulling="2026-04-20 19:32:25.74337021 +0000 UTC m=+710.457742276" lastFinishedPulling="2026-04-20 19:32:26.039655521 +0000 UTC m=+710.754027586" observedRunningTime="2026-04-20 19:32:26.983773976 +0000 UTC m=+711.698146062" watchObservedRunningTime="2026-04-20 19:32:26.984696762 +0000 UTC m=+711.699068875" Apr 20 19:32:27.835585 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:27.835548 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b212403-0fa0-4d03-905c-f4b9a1231c5f" path="/var/lib/kubelet/pods/6b212403-0fa0-4d03-905c-f4b9a1231c5f/volumes" Apr 20 19:32:37.977084 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:37.977057 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:32:38.018769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:38.018741 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-84f6879458-488b8"] Apr 20 19:32:38.018970 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:38.018952 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-84f6879458-488b8" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" containerName="manager" containerID="cri-o://cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06" gracePeriod=10 Apr 20 19:32:38.248511 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:38.248482 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:32:38.364038 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:38.364014 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k2z8\" (UniqueName: \"kubernetes.io/projected/8971c4e4-9626-423a-af58-ccd2453fc726-kube-api-access-5k2z8\") pod \"8971c4e4-9626-423a-af58-ccd2453fc726\" (UID: \"8971c4e4-9626-423a-af58-ccd2453fc726\") " Apr 20 19:32:38.365961 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:38.365930 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8971c4e4-9626-423a-af58-ccd2453fc726-kube-api-access-5k2z8" (OuterVolumeSpecName: "kube-api-access-5k2z8") pod "8971c4e4-9626-423a-af58-ccd2453fc726" (UID: "8971c4e4-9626-423a-af58-ccd2453fc726"). InnerVolumeSpecName "kube-api-access-5k2z8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:32:38.465092 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:38.465057 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5k2z8\" (UniqueName: \"kubernetes.io/projected/8971c4e4-9626-423a-af58-ccd2453fc726-kube-api-access-5k2z8\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:32:39.006042 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.006013 2577 generic.go:358] "Generic (PLEG): container finished" podID="8971c4e4-9626-423a-af58-ccd2453fc726" containerID="cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06" exitCode=0 Apr 20 19:32:39.006446 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.006075 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-84f6879458-488b8" Apr 20 19:32:39.006446 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.006090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84f6879458-488b8" event={"ID":"8971c4e4-9626-423a-af58-ccd2453fc726","Type":"ContainerDied","Data":"cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06"} Apr 20 19:32:39.006446 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.006123 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-84f6879458-488b8" event={"ID":"8971c4e4-9626-423a-af58-ccd2453fc726","Type":"ContainerDied","Data":"521e9c3ea3065b62876bc6a81e0e73a4dded35b162d703c79705bb20e8e3faaa"} Apr 20 19:32:39.006446 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.006139 2577 scope.go:117] "RemoveContainer" containerID="cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06" Apr 20 19:32:39.014186 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.014169 2577 scope.go:117] "RemoveContainer" containerID="d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484" Apr 20 19:32:39.020972 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.020851 2577 scope.go:117] "RemoveContainer" containerID="cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06" Apr 20 19:32:39.021091 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:32:39.021073 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06\": container with ID starting with cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06 not found: ID does not exist" containerID="cac78170833dbd227b6df0e67ae6c956bad2b2811429ab0a56027ea09a1ecb06" Apr 20 19:32:39.021129 ip-10-0-133-149 
Apr 20 19:32:39.021129 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.021117 2577 scope.go:117] "RemoveContainer" containerID="d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484"
Apr 20 19:32:39.021323 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:32:39.021308 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484\": container with ID starting with d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484 not found: ID does not exist" containerID="d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484"
Apr 20 19:32:39.021358 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.021329 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484"} err="failed to get container status \"d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484\": rpc error: code = NotFound desc = could not find container \"d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484\": container with ID starting with d2b6fe12e20bb506de7d312f8cb3bfbcd4327ecf9fb72296a2aab517de323484 not found: ID does not exist"
Apr 20 19:32:39.040765 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.040744 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-84f6879458-488b8"]
Apr 20 19:32:39.046779 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.046757 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-84f6879458-488b8"]
Apr 20 19:32:39.835607 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:39.835562 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" path="/var/lib/kubelet/pods/8971c4e4-9626-423a-af58-ccd2453fc726/volumes"
Apr 20 19:32:44.079168 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.079133 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6b79b6d4c6-mwrs9"]
Apr 20 19:32:44.079535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.079462 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" containerName="manager"
Apr 20 19:32:44.079535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.079475 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" containerName="manager"
Apr 20 19:32:44.079535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.079500 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" containerName="manager"
Apr 20 19:32:44.079535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.079509 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" containerName="manager"
containerName="manager" Apr 20 19:32:44.079672 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.079590 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" containerName="manager" Apr 20 19:32:44.082166 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.082150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.089977 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.089956 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6b79b6d4c6-mwrs9"] Apr 20 19:32:44.213752 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.213711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8-maas-api-tls\") pod \"maas-api-6b79b6d4c6-mwrs9\" (UID: \"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8\") " pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.213882 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.213776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xrx\" (UniqueName: \"kubernetes.io/projected/f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8-kube-api-access-k4xrx\") pod \"maas-api-6b79b6d4c6-mwrs9\" (UID: \"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8\") " pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.315073 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.315034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8-maas-api-tls\") pod \"maas-api-6b79b6d4c6-mwrs9\" (UID: \"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8\") " pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.315221 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.315092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xrx\" (UniqueName: \"kubernetes.io/projected/f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8-kube-api-access-k4xrx\") pod \"maas-api-6b79b6d4c6-mwrs9\" (UID: \"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8\") " pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.317405 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.317383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8-maas-api-tls\") pod \"maas-api-6b79b6d4c6-mwrs9\" (UID: \"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8\") " pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.322171 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.322139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xrx\" (UniqueName: \"kubernetes.io/projected/f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8-kube-api-access-k4xrx\") pod \"maas-api-6b79b6d4c6-mwrs9\" (UID: \"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8\") " pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.393507 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.393447 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:44.714792 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:44.714710 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6b79b6d4c6-mwrs9"] Apr 20 19:32:44.718136 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:32:44.718110 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5ec4b99_b0e0_47e9_afe1_0a76bb8008b8.slice/crio-cef5d32c7a852dd7e18052001e309b8226d37372e5d6c65c4a404d1ff2a2c6f7 WatchSource:0}: Error finding container cef5d32c7a852dd7e18052001e309b8226d37372e5d6c65c4a404d1ff2a2c6f7: Status 404 returned error can't find the container with id cef5d32c7a852dd7e18052001e309b8226d37372e5d6c65c4a404d1ff2a2c6f7 Apr 20 19:32:45.029339 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:45.029300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" event={"ID":"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8","Type":"ContainerStarted","Data":"cef5d32c7a852dd7e18052001e309b8226d37372e5d6c65c4a404d1ff2a2c6f7"} Apr 20 19:32:47.036694 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:47.036661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" event={"ID":"f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8","Type":"ContainerStarted","Data":"b03eb76dc0f5d7a6008850c81c08ce09e6e091e740e61c2d617db533015c7b7a"} Apr 20 19:32:47.037074 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:47.036707 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:47.054607 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:47.054563 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" podStartSLOduration=1.632171151 podStartE2EDuration="3.054549236s" podCreationTimestamp="2026-04-20 19:32:44 +0000 UTC" firstStartedPulling="2026-04-20 19:32:44.719588145 +0000 UTC m=+729.433960211" lastFinishedPulling="2026-04-20 19:32:46.141966227 +0000 UTC m=+730.856338296" observedRunningTime="2026-04-20 19:32:47.05331603 +0000 UTC m=+731.767688129" watchObservedRunningTime="2026-04-20 19:32:47.054549236 +0000 UTC m=+731.768921357" Apr 20 19:32:49.171801 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.171761 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h"] Apr 20 19:32:49.172336 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.172317 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8971c4e4-9626-423a-af58-ccd2453fc726" containerName="manager" Apr 20 19:32:49.174271 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.174250 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.177545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.177518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 19:32:49.177545 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.177531 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 19:32:49.177707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.177566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-9dkrq\"" Apr 20 19:32:49.177821 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.177805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 19:32:49.182701 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.182683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h"] Apr 20 19:32:49.254608 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.254577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/766682ad-b0b2-4fc1-b672-e89eb2e81728-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.254788 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.254621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.254788 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.254647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.254788 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.254765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgh6d\" (UniqueName: \"kubernetes.io/projected/766682ad-b0b2-4fc1-b672-e89eb2e81728-kube-api-access-lgh6d\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.254964 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.254821 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 
20 19:32:49.254964 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.254899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355204 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355175 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355358 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/766682ad-b0b2-4fc1-b672-e89eb2e81728-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355358 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355358 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355358 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgh6d\" (UniqueName: \"kubernetes.io/projected/766682ad-b0b2-4fc1-b672-e89eb2e81728-kube-api-access-lgh6d\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355358 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355617 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355749 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.355801 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.355718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.357515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.357492 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/766682ad-b0b2-4fc1-b672-e89eb2e81728-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.357759 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.357743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/766682ad-b0b2-4fc1-b672-e89eb2e81728-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.362409 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.362388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgh6d\" (UniqueName: \"kubernetes.io/projected/766682ad-b0b2-4fc1-b672-e89eb2e81728-kube-api-access-lgh6d\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h\" (UID: \"766682ad-b0b2-4fc1-b672-e89eb2e81728\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.485512 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.485483 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:32:49.602502 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:49.602478 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h"] Apr 20 19:32:49.604245 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:32:49.604218 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod766682ad_b0b2_4fc1_b672_e89eb2e81728.slice/crio-cc0aeee0bcc3ba79e5ab586ed67a434bdb35dfbbdc45a8e25cd105cd229ba795 WatchSource:0}: Error finding container cc0aeee0bcc3ba79e5ab586ed67a434bdb35dfbbdc45a8e25cd105cd229ba795: Status 404 returned error can't find the container with id cc0aeee0bcc3ba79e5ab586ed67a434bdb35dfbbdc45a8e25cd105cd229ba795 Apr 20 19:32:50.047712 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.047676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" event={"ID":"766682ad-b0b2-4fc1-b672-e89eb2e81728","Type":"ContainerStarted","Data":"cc0aeee0bcc3ba79e5ab586ed67a434bdb35dfbbdc45a8e25cd105cd229ba795"} Apr 20 19:32:50.066767 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.066743 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64"] Apr 20 19:32:50.069828 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.069812 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.071804 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.071786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 19:32:50.078009 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.077987 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64"] Apr 20 19:32:50.162489 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.162454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.162645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.162525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.162645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.162547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-526dz\" (UniqueName: \"kubernetes.io/projected/16593762-f52f-4f3e-a29c-ba18fa425e4c-kube-api-access-526dz\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.162645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.162566 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16593762-f52f-4f3e-a29c-ba18fa425e4c-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.162645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.162586 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.162645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.162626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263185 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263568 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-526dz\" (UniqueName: \"kubernetes.io/projected/16593762-f52f-4f3e-a29c-ba18fa425e4c-kube-api-access-526dz\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263568 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16593762-f52f-4f3e-a29c-ba18fa425e4c-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263568 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263568 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263568 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263302 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263860 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263860 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.263860 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.263682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.265706 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.265680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/16593762-f52f-4f3e-a29c-ba18fa425e4c-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.266084 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.266061 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16593762-f52f-4f3e-a29c-ba18fa425e4c-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.277289 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.277268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-526dz\" (UniqueName: \"kubernetes.io/projected/16593762-f52f-4f3e-a29c-ba18fa425e4c-kube-api-access-526dz\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64\" (UID: \"16593762-f52f-4f3e-a29c-ba18fa425e4c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.381192 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.381116 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:32:50.509252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:50.509195 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64"] Apr 20 19:32:50.513203 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:32:50.513171 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16593762_f52f_4f3e_a29c_ba18fa425e4c.slice/crio-80f5cca8e580955cd2230b9e9b03c111c5956fda3afcf6479af75ae86bd2a371 WatchSource:0}: Error finding container 80f5cca8e580955cd2230b9e9b03c111c5956fda3afcf6479af75ae86bd2a371: Status 404 returned error can't find the container with id 80f5cca8e580955cd2230b9e9b03c111c5956fda3afcf6479af75ae86bd2a371 Apr 20 19:32:51.055472 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:51.055416 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" event={"ID":"16593762-f52f-4f3e-a29c-ba18fa425e4c","Type":"ContainerStarted","Data":"80f5cca8e580955cd2230b9e9b03c111c5956fda3afcf6479af75ae86bd2a371"} Apr 20 19:32:53.045355 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.045329 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6b79b6d4c6-mwrs9" Apr 20 19:32:53.088006 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.087975 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-79c5c4c5cf-vgfcx"] Apr 20 19:32:53.088292 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.088255 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" podUID="7a64ca67-0dde-43ce-99b7-068526223bc8" containerName="maas-api" containerID="cri-o://8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe" gracePeriod=30 Apr 20 19:32:53.366829 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.366749 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:53.389639 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.389608 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd7p8\" (UniqueName: \"kubernetes.io/projected/7a64ca67-0dde-43ce-99b7-068526223bc8-kube-api-access-xd7p8\") pod \"7a64ca67-0dde-43ce-99b7-068526223bc8\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " Apr 20 19:32:53.389798 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.389715 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a64ca67-0dde-43ce-99b7-068526223bc8-maas-api-tls\") pod \"7a64ca67-0dde-43ce-99b7-068526223bc8\" (UID: \"7a64ca67-0dde-43ce-99b7-068526223bc8\") " Apr 20 19:32:53.391606 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.391579 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a64ca67-0dde-43ce-99b7-068526223bc8-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "7a64ca67-0dde-43ce-99b7-068526223bc8" (UID: "7a64ca67-0dde-43ce-99b7-068526223bc8"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:32:53.391690 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.391635 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a64ca67-0dde-43ce-99b7-068526223bc8-kube-api-access-xd7p8" (OuterVolumeSpecName: "kube-api-access-xd7p8") pod "7a64ca67-0dde-43ce-99b7-068526223bc8" (UID: "7a64ca67-0dde-43ce-99b7-068526223bc8"). InnerVolumeSpecName "kube-api-access-xd7p8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:32:53.491252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.491207 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xd7p8\" (UniqueName: \"kubernetes.io/projected/7a64ca67-0dde-43ce-99b7-068526223bc8-kube-api-access-xd7p8\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:32:53.491252 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:53.491248 2577 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a64ca67-0dde-43ce-99b7-068526223bc8-maas-api-tls\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:32:54.069297 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:54.069255 2577 generic.go:358] "Generic (PLEG): container finished" podID="7a64ca67-0dde-43ce-99b7-068526223bc8" containerID="8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe" exitCode=0 Apr 20 19:32:54.069773 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:54.069367 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" Apr 20 19:32:54.069773 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:54.069352 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" event={"ID":"7a64ca67-0dde-43ce-99b7-068526223bc8","Type":"ContainerDied","Data":"8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe"} Apr 20 19:32:54.069773 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:54.069516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79c5c4c5cf-vgfcx" event={"ID":"7a64ca67-0dde-43ce-99b7-068526223bc8","Type":"ContainerDied","Data":"87e27ff27517cc08a3a6adf2eb092a49d2f501bb4e2c27c14a7d60230ffffc5b"} Apr 20 19:32:54.069773 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:54.069556 2577 scope.go:117] "RemoveContainer" containerID="8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe" Apr 20 19:32:54.084621 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:54.084584 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-79c5c4c5cf-vgfcx"] Apr 20 19:32:54.090481 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:54.090455 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-79c5c4c5cf-vgfcx"] Apr 20 19:32:55.836983 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:55.836939 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a64ca67-0dde-43ce-99b7-068526223bc8" path="/var/lib/kubelet/pods/7a64ca67-0dde-43ce-99b7-068526223bc8/volumes" Apr 20 19:32:56.447487 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:56.447247 2577 scope.go:117] "RemoveContainer" containerID="8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe" Apr 20 19:32:56.447592 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:32:56.447571 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe\": container with ID starting with 8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe not found: ID does not exist" containerID="8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe" Apr 20 19:32:56.447634 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:56.447604 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe"} err="failed to get container status \"8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe\": rpc error: code = NotFound desc = could not find container \"8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe\": container with ID starting with 8d64b2fa8e34c22e6638b2c92204855f13031fb56555c0ae99b7a67fe82089fe not found: ID does not exist" Apr 20 19:32:57.082702 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:57.082658 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" event={"ID":"16593762-f52f-4f3e-a29c-ba18fa425e4c","Type":"ContainerStarted","Data":"f3170eb92fc205d0da3d7b9186435c7024c7e50d8502bb9e4a40a631dea8988a"} Apr 20 19:32:57.084958 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:32:57.084936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" event={"ID":"766682ad-b0b2-4fc1-b672-e89eb2e81728","Type":"ContainerStarted","Data":"e6102fbfe570231d16c58d778944834217ad2ab1d0c4682ce288f9bf35305428"} Apr 20 19:33:02.106210 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:02.106179 2577 generic.go:358] "Generic (PLEG): container finished" podID="766682ad-b0b2-4fc1-b672-e89eb2e81728" containerID="e6102fbfe570231d16c58d778944834217ad2ab1d0c4682ce288f9bf35305428" exitCode=0 Apr 20 19:33:02.106603 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:02.106259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" event={"ID":"766682ad-b0b2-4fc1-b672-e89eb2e81728","Type":"ContainerDied","Data":"e6102fbfe570231d16c58d778944834217ad2ab1d0c4682ce288f9bf35305428"} Apr 20 19:33:02.107612 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:02.107591 2577 generic.go:358] "Generic (PLEG): container finished" podID="16593762-f52f-4f3e-a29c-ba18fa425e4c" containerID="f3170eb92fc205d0da3d7b9186435c7024c7e50d8502bb9e4a40a631dea8988a" exitCode=0 Apr 20 19:33:02.107700 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:02.107640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" event={"ID":"16593762-f52f-4f3e-a29c-ba18fa425e4c","Type":"ContainerDied","Data":"f3170eb92fc205d0da3d7b9186435c7024c7e50d8502bb9e4a40a631dea8988a"} Apr 20 19:33:04.115911 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:04.115867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" event={"ID":"766682ad-b0b2-4fc1-b672-e89eb2e81728","Type":"ContainerStarted","Data":"eedda076681fcf842ab5ce34c8fcf4cbe5064e8a5c1bc95ce61798a195ed9aa9"} Apr 20 19:33:04.116353 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:04.116118 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:33:04.117468 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:04.117446 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" event={"ID":"16593762-f52f-4f3e-a29c-ba18fa425e4c","Type":"ContainerStarted","Data":"1e5ecd36c5379e5cfeb950ab49e15fc09ec8c801d09100b008d555c439f5e188"} Apr 20 19:33:04.117645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:04.117629 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:33:04.134014 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:04.133969 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" podStartSLOduration=1.443762144 podStartE2EDuration="15.133957404s" podCreationTimestamp="2026-04-20 19:32:49 +0000 UTC" firstStartedPulling="2026-04-20 19:32:49.606093023 +0000 UTC m=+734.320465092" lastFinishedPulling="2026-04-20 19:33:03.296288273 +0000 UTC m=+748.010660352" observedRunningTime="2026-04-20 19:33:04.132845154 +0000 UTC m=+748.847217252" watchObservedRunningTime="2026-04-20 19:33:04.133957404 +0000 UTC m=+748.848329513" Apr 20 19:33:04.151279 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:04.151240 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" podStartSLOduration=1.366485938 podStartE2EDuration="14.151229221s" podCreationTimestamp="2026-04-20 19:32:50 +0000 UTC" firstStartedPulling="2026-04-20 19:32:50.515507907 +0000 UTC m=+735.229879976" lastFinishedPulling="2026-04-20 19:33:03.30025119 +0000 UTC m=+748.014623259" observedRunningTime="2026-04-20 19:33:04.150230473 +0000 UTC m=+748.864602561" watchObservedRunningTime="2026-04-20 19:33:04.151229221 +0000 UTC m=+748.865601308" Apr 20 19:33:15.132881 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:15.132850 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h" Apr 20 19:33:15.133620 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:15.133596 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64" Apr 20 19:33:19.974361 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:19.974320 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc"] Apr 20 19:33:19.974769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:19.974641 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a64ca67-0dde-43ce-99b7-068526223bc8" containerName="maas-api" Apr 20 19:33:19.974769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:19.974652 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a64ca67-0dde-43ce-99b7-068526223bc8" containerName="maas-api" Apr 20 19:33:19.974769 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:19.974719 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a64ca67-0dde-43ce-99b7-068526223bc8" containerName="maas-api" Apr 20 19:33:19.977472 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:19.977453 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:19.980633 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:19.980614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 19:33:19.987541 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:19.987523 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc"] Apr 20 19:33:20.120902 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.120861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.120902 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.120901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.121095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.120929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gwwj\" (UniqueName: \"kubernetes.io/projected/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-kube-api-access-4gwwj\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.121095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.120985 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.121095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.120999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.121095 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.121013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222211 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222177 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222377 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222377 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gwwj\" (UniqueName: \"kubernetes.io/projected/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-kube-api-access-4gwwj\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222377 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222377 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222529 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222690 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222757 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.222799 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.222780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.224556 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.224502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.224827 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.224811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.229828 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.229799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gwwj\" (UniqueName: \"kubernetes.io/projected/9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2-kube-api-access-4gwwj\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc\" (UID: \"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.286599 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.286570 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:20.404273 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:20.404251 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc"] Apr 20 19:33:20.406394 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:33:20.406361 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d22ccb9_be01_4a8a_9c66_fbfaa2ddf2f2.slice/crio-fad206d7d1ae295393afcd128b3c2584ec147c191bc8c2003041e7288ffc16ca WatchSource:0}: Error finding container fad206d7d1ae295393afcd128b3c2584ec147c191bc8c2003041e7288ffc16ca: Status 404 returned error can't find the container with id fad206d7d1ae295393afcd128b3c2584ec147c191bc8c2003041e7288ffc16ca Apr 20 19:33:21.170807 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:21.170768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" event={"ID":"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2","Type":"ContainerStarted","Data":"00162274da76c439e2f8eb0c7ab150025bb233df3df42361681fece4ce4a4ba3"} Apr 20 19:33:21.171229 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:21.170812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" event={"ID":"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2","Type":"ContainerStarted","Data":"fad206d7d1ae295393afcd128b3c2584ec147c191bc8c2003041e7288ffc16ca"} Apr 20 19:33:29.200408 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:29.200367 2577 generic.go:358] "Generic (PLEG): container finished" podID="9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2" containerID="00162274da76c439e2f8eb0c7ab150025bb233df3df42361681fece4ce4a4ba3" exitCode=0 Apr 20 19:33:29.200775 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:29.200436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" event={"ID":"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2","Type":"ContainerDied","Data":"00162274da76c439e2f8eb0c7ab150025bb233df3df42361681fece4ce4a4ba3"} Apr 20 19:33:29.201030 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:29.201014 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:33:30.206205 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:30.206173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" event={"ID":"9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2","Type":"ContainerStarted","Data":"5c4a256520b49d8faab8d3c73c9b7e9d9505798e3a17dd206087f479a3062922"} Apr 20 19:33:30.206614 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:30.206364 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:33:30.223962 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:30.223917 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" podStartSLOduration=10.999129633 podStartE2EDuration="11.223905295s" podCreationTimestamp="2026-04-20 19:33:19 +0000 UTC" firstStartedPulling="2026-04-20 19:33:29.201125651 +0000 UTC m=+773.915497718" lastFinishedPulling="2026-04-20 19:33:29.425901303 +0000 UTC m=+774.140273380" observedRunningTime="2026-04-20 19:33:30.221790263 +0000 UTC 
m=+774.936162341" watchObservedRunningTime="2026-04-20 19:33:30.223905295 +0000 UTC m=+774.938277382" Apr 20 19:33:41.222044 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:33:41.222017 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc" Apr 20 19:35:35.801478 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:35:35.801447 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log" Apr 20 19:35:35.803895 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:35:35.803873 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log" Apr 20 19:36:04.149238 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.149202 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f579989cf-nzc6t"] Apr 20 19:36:04.149688 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.149447 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-f579989cf-nzc6t" podUID="34efbb5a-b469-45ea-a731-c36736884ce7" containerName="manager" containerID="cri-o://e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868" gracePeriod=10 Apr 20 19:36:04.387470 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.387446 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f579989cf-nzc6t" Apr 20 19:36:04.400626 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.400569 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkqm2\" (UniqueName: \"kubernetes.io/projected/34efbb5a-b469-45ea-a731-c36736884ce7-kube-api-access-hkqm2\") pod \"34efbb5a-b469-45ea-a731-c36736884ce7\" (UID: \"34efbb5a-b469-45ea-a731-c36736884ce7\") " Apr 20 19:36:04.402471 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.402444 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34efbb5a-b469-45ea-a731-c36736884ce7-kube-api-access-hkqm2" (OuterVolumeSpecName: "kube-api-access-hkqm2") pod "34efbb5a-b469-45ea-a731-c36736884ce7" (UID: "34efbb5a-b469-45ea-a731-c36736884ce7"). InnerVolumeSpecName "kube-api-access-hkqm2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:36:04.501714 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.501687 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkqm2\" (UniqueName: \"kubernetes.io/projected/34efbb5a-b469-45ea-a731-c36736884ce7-kube-api-access-hkqm2\") on node \"ip-10-0-133-149.ec2.internal\" DevicePath \"\"" Apr 20 19:36:04.711280 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.711198 2577 generic.go:358] "Generic (PLEG): container finished" podID="34efbb5a-b469-45ea-a731-c36736884ce7" containerID="e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868" exitCode=0 Apr 20 19:36:04.711280 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.711266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f579989cf-nzc6t" event={"ID":"34efbb5a-b469-45ea-a731-c36736884ce7","Type":"ContainerDied","Data":"e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868"} Apr 20 19:36:04.711280 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.711270 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f579989cf-nzc6t"
Apr 20 19:36:04.711535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.711290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f579989cf-nzc6t" event={"ID":"34efbb5a-b469-45ea-a731-c36736884ce7","Type":"ContainerDied","Data":"acd5e2fd5aa00b9aaedf38b49d01fc769e89494376b39ad7c1b9d87275a95ff8"}
Apr 20 19:36:04.711535 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.711306 2577 scope.go:117] "RemoveContainer" containerID="e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868"
Apr 20 19:36:04.719639 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.719619 2577 scope.go:117] "RemoveContainer" containerID="e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868"
Apr 20 19:36:04.719916 ip-10-0-133-149 kubenswrapper[2577]: E0420 19:36:04.719896 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868\": container with ID starting with e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868 not found: ID does not exist" containerID="e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868"
Apr 20 19:36:04.719974 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.719925 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868"} err="failed to get container status \"e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868\": rpc error: code = NotFound desc = could not find container \"e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868\": container with ID starting with e864fb1d53b97738728d24b852dece37f395f5c3628f4a0939067b2f05056868 not found: ID does not exist"
Apr 20 19:36:04.774452 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.774428 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f579989cf-nzc6t"]
Apr 20 19:36:04.781239 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:04.781216 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f579989cf-nzc6t"]
Apr 20 19:36:05.553740 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.553704 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f579989cf-d5r7z"]
Apr 20 19:36:05.554086 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.554066 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34efbb5a-b469-45ea-a731-c36736884ce7" containerName="manager"
Apr 20 19:36:05.554086 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.554080 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="34efbb5a-b469-45ea-a731-c36736884ce7" containerName="manager"
Apr 20 19:36:05.554164 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.554153 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="34efbb5a-b469-45ea-a731-c36736884ce7" containerName="manager"
Apr 20 19:36:05.558334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.558315 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f579989cf-d5r7z"
Apr 20 19:36:05.560700 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.560681 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-9lgw8\""
Apr 20 19:36:05.563824 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.563800 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f579989cf-d5r7z"]
Apr 20 19:36:05.610188 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.610163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gw9\" (UniqueName: \"kubernetes.io/projected/3eb69b1a-3c1b-44ef-b5c0-7779c66ef030-kube-api-access-s7gw9\") pod \"maas-controller-f579989cf-d5r7z\" (UID: \"3eb69b1a-3c1b-44ef-b5c0-7779c66ef030\") " pod="opendatahub/maas-controller-f579989cf-d5r7z"
Apr 20 19:36:05.710718 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.710694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gw9\" (UniqueName: \"kubernetes.io/projected/3eb69b1a-3c1b-44ef-b5c0-7779c66ef030-kube-api-access-s7gw9\") pod \"maas-controller-f579989cf-d5r7z\" (UID: \"3eb69b1a-3c1b-44ef-b5c0-7779c66ef030\") " pod="opendatahub/maas-controller-f579989cf-d5r7z"
Apr 20 19:36:05.719021 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.719000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gw9\" (UniqueName: \"kubernetes.io/projected/3eb69b1a-3c1b-44ef-b5c0-7779c66ef030-kube-api-access-s7gw9\") pod \"maas-controller-f579989cf-d5r7z\" (UID: \"3eb69b1a-3c1b-44ef-b5c0-7779c66ef030\") " pod="opendatahub/maas-controller-f579989cf-d5r7z"
Apr 20 19:36:05.834801 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.834719 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34efbb5a-b469-45ea-a731-c36736884ce7" path="/var/lib/kubelet/pods/34efbb5a-b469-45ea-a731-c36736884ce7/volumes"
Apr 20 19:36:05.869239 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.869223 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f579989cf-d5r7z"
Apr 20 19:36:05.993893 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:05.993869 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f579989cf-d5r7z"]
Apr 20 19:36:05.995932 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:36:05.995907 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb69b1a_3c1b_44ef_b5c0_7779c66ef030.slice/crio-c8d3b86a750ec4f1f3d6272aebeafed33b3199e0f845c0c112eb660a59764801 WatchSource:0}: Error finding container c8d3b86a750ec4f1f3d6272aebeafed33b3199e0f845c0c112eb660a59764801: Status 404 returned error can't find the container with id c8d3b86a750ec4f1f3d6272aebeafed33b3199e0f845c0c112eb660a59764801
Apr 20 19:36:06.719311 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:06.719233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f579989cf-d5r7z" event={"ID":"3eb69b1a-3c1b-44ef-b5c0-7779c66ef030","Type":"ContainerStarted","Data":"3f4cf79e725f2e6a07d3394ac773749ed4b55ff16ae9f94588a083a7eb3e93c3"}
Apr 20 19:36:06.719311 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:06.719268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f579989cf-d5r7z" event={"ID":"3eb69b1a-3c1b-44ef-b5c0-7779c66ef030","Type":"ContainerStarted","Data":"c8d3b86a750ec4f1f3d6272aebeafed33b3199e0f845c0c112eb660a59764801"}
Apr 20 19:36:06.719683 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:06.719372 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f579989cf-d5r7z"
Apr 20 19:36:06.735370 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:06.735318 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f579989cf-d5r7z" podStartSLOduration=1.380469408 podStartE2EDuration="1.735305555s" podCreationTimestamp="2026-04-20 19:36:05 +0000 UTC" firstStartedPulling="2026-04-20 19:36:05.997443109 +0000 UTC m=+930.711815176" lastFinishedPulling="2026-04-20 19:36:06.352279245 +0000 UTC m=+931.066651323" observedRunningTime="2026-04-20 19:36:06.733337724 +0000 UTC m=+931.447709809" watchObservedRunningTime="2026-04-20 19:36:06.735305555 +0000 UTC m=+931.449677643"
Apr 20 19:36:17.727854 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:36:17.727820 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f579989cf-d5r7z"
Apr 20 19:40:35.824420 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:40:35.824390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:40:35.828002 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:40:35.827980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:45:35.851454 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:45:35.851428 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:45:35.855707 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:45:35.855687 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:50:35.874500 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:50:35.874473 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:50:35.881720 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:50:35.881694 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:55:35.897967 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:55:35.897940 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:55:35.914865 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:55:35.914840 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:57:08.322740 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:08.322694 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6b79b6d4c6-mwrs9_f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8/maas-api/0.log"
Apr 20 19:57:08.446054 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:08.446032 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-f579989cf-d5r7z_3eb69b1a-3c1b-44ef-b5c0-7779c66ef030/manager/0.log"
Apr 20 19:57:08.695995 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:08.695919 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7875d57869-6x8vq_9227d6c3-5736-4590-b1af-eaf1d30a0b56/manager/0.log"
Apr 20 19:57:10.731358 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:10.731319 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-92jsm_17ec7f02-9904-49e3-9f2e-554d1406846b/kuadrant-console-plugin/0.log"
Apr 20 19:57:11.801688 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:11.801655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-65b68d668c-pdqzs_4b61770f-53f3-445c-833a-79a6399688f0/kube-auth-proxy/0.log"
Apr 20 19:57:12.635577 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:12.635531 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64_16593762-f52f-4f3e-a29c-ba18fa425e4c/main/0.log"
Apr 20 19:57:12.642859 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:12.642837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-wsx64_16593762-f52f-4f3e-a29c-ba18fa425e4c/storage-initializer/0.log"
Apr 20 19:57:12.876974 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:12.876938 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h_766682ad-b0b2-4fc1-b672-e89eb2e81728/storage-initializer/0.log"
Apr 20 19:57:12.885397 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:12.885368 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjv85h_766682ad-b0b2-4fc1-b672-e89eb2e81728/main/0.log"
Apr 20 19:57:13.136310 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:13.136264 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc_9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2/storage-initializer/0.log"
Apr 20 19:57:13.145528 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:13.145501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-5pmkc_9d22ccb9-be01-4a8a-9c66-fbfaa2ddf2f2/main/0.log"
Apr 20 19:57:20.665716 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:20.665684 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9r8xq_17c6e5a1-3d98-4126-b48d-b3e384ab3179/global-pull-secret-syncer/0.log"
Apr 20 19:57:20.790928 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:20.790896 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gzt79_8372aa91-c5a0-4714-939b-8dc6743d0b72/konnectivity-agent/0.log"
Apr 20 19:57:20.885533 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:20.885486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-149.ec2.internal_562c22904357368d150bcfb5b4deac02/haproxy/0.log"
Apr 20 19:57:26.030643 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:26.030532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-92jsm_17ec7f02-9904-49e3-9f2e-554d1406846b/kuadrant-console-plugin/0.log"
Apr 20 19:57:27.963406 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:27.963378 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-524jk_d9241b80-47a6-4cf4-8485-01b585082093/cluster-monitoring-operator/0.log"
Apr 20 19:57:28.074886 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.074862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-v8rqf_060b5a16-fbe2-4e36-b398-85d8ac1178a9/monitoring-plugin/0.log"
Apr 20 19:57:28.235762 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.235717 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vxd2g_4a07410c-4c3a-40a9-955d-2fc040fffc3a/node-exporter/0.log"
Apr 20 19:57:28.252398 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.252374 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vxd2g_4a07410c-4c3a-40a9-955d-2fc040fffc3a/kube-rbac-proxy/0.log"
Apr 20 19:57:28.269575 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.269554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vxd2g_4a07410c-4c3a-40a9-955d-2fc040fffc3a/init-textfile/0.log"
Apr 20 19:57:28.372274 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.372248 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7a33ca71-8d9f-45c6-bb56-aab488691412/prometheus/0.log"
Apr 20 19:57:28.389638 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.389602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7a33ca71-8d9f-45c6-bb56-aab488691412/config-reloader/0.log"
Apr 20 19:57:28.408050 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.408025 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7a33ca71-8d9f-45c6-bb56-aab488691412/thanos-sidecar/0.log"
Apr 20 19:57:28.426919 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.426895 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7a33ca71-8d9f-45c6-bb56-aab488691412/kube-rbac-proxy-web/0.log"
Apr 20 19:57:28.449143 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.449121 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7a33ca71-8d9f-45c6-bb56-aab488691412/kube-rbac-proxy/0.log"
Apr 20 19:57:28.468026 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.467997 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7a33ca71-8d9f-45c6-bb56-aab488691412/kube-rbac-proxy-thanos/0.log"
Apr 20 19:57:28.489055 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:28.488988 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7a33ca71-8d9f-45c6-bb56-aab488691412/init-config-reloader/0.log"
Apr 20 19:57:29.564675 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.564644 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"]
Apr 20 19:57:29.567963 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.567940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.570222 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.570203 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5pdh\"/\"kube-root-ca.crt\""
Apr 20 19:57:29.570331 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.570244 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r5pdh\"/\"default-dockercfg-cwh22\""
Apr 20 19:57:29.570331 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.570253 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5pdh\"/\"openshift-service-ca.crt\""
Apr 20 19:57:29.576408 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.576385 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"]
Apr 20 19:57:29.674692 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.674653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-proc\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.674692 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.674694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9zp\" (UniqueName: \"kubernetes.io/projected/91103ff4-d086-47f6-ab86-e1941c8849c4-kube-api-access-xn9zp\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.674868 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.674767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-lib-modules\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.674868 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.674804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-sys\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.674868 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.674831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-podres\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776065 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-podres\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776163 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-proc\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776163 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9zp\" (UniqueName: \"kubernetes.io/projected/91103ff4-d086-47f6-ab86-e1941c8849c4-kube-api-access-xn9zp\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776163 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-lib-modules\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776278 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-sys\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776278 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-podres\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776278 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-proc\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776278 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-sys\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.776404 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.776286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91103ff4-d086-47f6-ab86-e1941c8849c4-lib-modules\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.783832 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.783812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9zp\" (UniqueName: \"kubernetes.io/projected/91103ff4-d086-47f6-ab86-e1941c8849c4-kube-api-access-xn9zp\") pod \"perf-node-gather-daemonset-n47ll\" (UID: \"91103ff4-d086-47f6-ab86-e1941c8849c4\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:29.878472 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:29.878405 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:30.000759 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:30.000715 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"]
Apr 20 19:57:30.002589 ip-10-0-133-149 kubenswrapper[2577]: W0420 19:57:30.002563 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod91103ff4_d086_47f6_ab86_e1941c8849c4.slice/crio-623f3a5aecf5035e6033a25d849f7e877f85979986605fe9556b9e889633fe91 WatchSource:0}: Error finding container 623f3a5aecf5035e6033a25d849f7e877f85979986605fe9556b9e889633fe91: Status 404 returned error can't find the container with id 623f3a5aecf5035e6033a25d849f7e877f85979986605fe9556b9e889633fe91
Apr 20 19:57:30.004169 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:30.004146 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:57:30.846939 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:30.846874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll" event={"ID":"91103ff4-d086-47f6-ab86-e1941c8849c4","Type":"ContainerStarted","Data":"f3057ca81515495400101e5f0c54e7787395980d9212035e356df0dfeb6fe3b6"}
Apr 20 19:57:30.846939 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:30.846937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll" event={"ID":"91103ff4-d086-47f6-ab86-e1941c8849c4","Type":"ContainerStarted","Data":"623f3a5aecf5035e6033a25d849f7e877f85979986605fe9556b9e889633fe91"}
Apr 20 19:57:30.847457 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:30.846985 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:30.862222 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:30.862167 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll" podStartSLOduration=1.86214929 podStartE2EDuration="1.86214929s" podCreationTimestamp="2026-04-20 19:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:57:30.861211063 +0000 UTC m=+2215.575583151" watchObservedRunningTime="2026-04-20 19:57:30.86214929 +0000 UTC m=+2215.576521381"
Apr 20 19:57:32.096972 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:32.096942 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qksj4_00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd/dns/0.log"
Apr 20 19:57:32.116567 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:32.116540 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qksj4_00ad0cd4-ddc3-4fb0-8b71-18eaf617c1cd/kube-rbac-proxy/0.log"
Apr 20 19:57:32.185075 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:32.185049 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lnjzz_bd768ad4-6493-4653-aa46-ff5c53a0532e/dns-node-resolver/0.log"
Apr 20 19:57:32.629868 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:32.629840 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-85d95c7f6-4v9jj_7eefee4f-85f2-4490-9054-b8484ab1a66f/registry/0.log"
Apr 20 19:57:32.687368 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:32.687342 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rnzz5_8f8d204a-6287-475e-8bb2-4e2081ea3788/node-ca/0.log"
Apr 20 19:57:33.631302 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:33.631273 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-65b68d668c-pdqzs_4b61770f-53f3-445c-833a-79a6399688f0/kube-auth-proxy/0.log"
Apr 20 19:57:34.274437 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:34.274407 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7cqnx_4a717388-605c-4d9d-8381-4bbf7fe371fb/serve-healthcheck-canary/0.log"
Apr 20 19:57:34.794079 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:34.794041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-tjv47_cf27dde3-1580-4f60-ad2f-abd6f261c5c1/insights-operator/0.log"
Apr 20 19:57:34.797645 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:34.797623 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-tjv47_cf27dde3-1580-4f60-ad2f-abd6f261c5c1/insights-operator/1.log"
Apr 20 19:57:34.932625 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:34.932595 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8pzkw_ff1dc025-5bbe-4675-8b44-c791098ecbb6/kube-rbac-proxy/0.log"
Apr 20 19:57:34.951619 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:34.951601 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8pzkw_ff1dc025-5bbe-4675-8b44-c791098ecbb6/exporter/0.log"
Apr 20 19:57:34.970666 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:34.970644 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8pzkw_ff1dc025-5bbe-4675-8b44-c791098ecbb6/extractor/0.log"
Apr 20 19:57:36.860166 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:36.860136 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-n47ll"
Apr 20 19:57:36.899375 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:36.899341 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6b79b6d4c6-mwrs9_f5ec4b99-b0e0-47e9-afe1-0a76bb8008b8/maas-api/0.log"
Apr 20 19:57:36.973564 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:36.973528 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-f579989cf-d5r7z_3eb69b1a-3c1b-44ef-b5c0-7779c66ef030/manager/0.log"
Apr 20 19:57:37.022393 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:37.022348 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7875d57869-6x8vq_9227d6c3-5736-4590-b1af-eaf1d30a0b56/manager/0.log"
Apr 20 19:57:38.333051 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:38.333023 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5c6db948fd-2h2ld_4973e77a-7c03-43d9-8a24-133776787912/manager/0.log"
Apr 20 19:57:42.870047 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:42.870014 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-b5p57_ebdf0ce7-38ad-46f6-a8af-493c326f2cfb/migrator/0.log"
Apr 20 19:57:42.890038 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:42.890008 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-b5p57_ebdf0ce7-38ad-46f6-a8af-493c326f2cfb/graceful-termination/0.log"
Apr 20 19:57:43.254938 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:43.254907 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-t96hc_c19d268b-1a81-44d2-9b22-adc4e7ec01d0/kube-storage-version-migrator-operator/1.log"
Apr 20 19:57:43.256515 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:43.256484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-t96hc_c19d268b-1a81-44d2-9b22-adc4e7ec01d0/kube-storage-version-migrator-operator/0.log"
Apr 20 19:57:44.409857 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.409786 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k75h7_94a9964b-f6a5-4b72-8989-1efbd67f430d/kube-multus-additional-cni-plugins/0.log"
Apr 20 19:57:44.428562 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.428524 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k75h7_94a9964b-f6a5-4b72-8989-1efbd67f430d/egress-router-binary-copy/0.log"
Apr 20 19:57:44.445869 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.445850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k75h7_94a9964b-f6a5-4b72-8989-1efbd67f430d/cni-plugins/0.log"
Apr 20 19:57:44.463482 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.463465 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k75h7_94a9964b-f6a5-4b72-8989-1efbd67f430d/bond-cni-plugin/0.log"
Apr 20 19:57:44.481230 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.481211 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k75h7_94a9964b-f6a5-4b72-8989-1efbd67f430d/routeoverride-cni/0.log"
Apr 20 19:57:44.499967 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.499948 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k75h7_94a9964b-f6a5-4b72-8989-1efbd67f430d/whereabouts-cni-bincopy/0.log"
Apr 20 19:57:44.517932 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.517913 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k75h7_94a9964b-f6a5-4b72-8989-1efbd67f430d/whereabouts-cni/0.log"
Apr 20 19:57:44.719435 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.719362 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fdm6h_92c4c570-25df-4201-b0cf-3fc5e5d442d8/kube-multus/0.log"
Apr 20 19:57:44.777055 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.777020 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tssws_39c06111-8b7a-4d9f-a3de-f5c655ac387d/network-metrics-daemon/0.log"
Apr 20 19:57:44.794334 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:44.794303 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tssws_39c06111-8b7a-4d9f-a3de-f5c655ac387d/kube-rbac-proxy/0.log"
Apr 20 19:57:46.191453 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.191424 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-controller/0.log"
Apr 20 19:57:46.205574 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.205544 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/0.log"
Apr 20 19:57:46.224569 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.224545 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovn-acl-logging/1.log"
Apr 20 19:57:46.248120 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.248100 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/kube-rbac-proxy-node/0.log"
Apr 20 19:57:46.267562 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.267540 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 19:57:46.282075 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.282043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/northd/0.log"
Apr 20 19:57:46.300228 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.300205 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/nbdb/0.log"
Apr 20 19:57:46.319464 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.319448 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/sbdb/0.log"
Apr 20 19:57:46.512562 ip-10-0-133-149 kubenswrapper[2577]: I0420 19:57:46.512534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rhxmj_ec1d5da3-6144-4314-be21-f06f578325c6/ovnkube-controller/0.log"