Apr 20 21:12:58.503715 ip-10-0-132-45 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 21:12:58.885677 ip-10-0-132-45 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:12:58.885677 ip-10-0-132-45 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 21:12:58.885677 ip-10-0-132-45 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:12:58.885677 ip-10-0-132-45 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 21:12:58.885677 ip-10-0-132-45 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:12:58.888246 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.888174 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 21:12:58.891714 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891700 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:12:58.891714 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891714 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891717 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891721 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891724 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891727 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891729 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891732 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891736 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891738 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891747 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891750 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891753 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891755 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891758 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891761 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891769 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891772 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891774 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891777 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891779 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:12:58.891778 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891782 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891785 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891788 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891791 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891793 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891796 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891799 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891801 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891803 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891806 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891808 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891811 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891814 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891816 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891819 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891821 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891824 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891826 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891829 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891831 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:58.892245 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891833 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891838 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891847 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891850 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891853 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891855 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891858 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891861 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891863 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891866 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891869 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891871 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891874 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891876 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891880 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891883 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891885 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891888 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891891 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:58.892805 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891893 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891896 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891900 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891904 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891906 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891912 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891915 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891918 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891920 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891924 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891926 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891929 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891931 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891934 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891937 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891939 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891947 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891950 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891952 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891955 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:58.893284 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891957 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891959 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891962 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891964 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891967 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.891969 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892395 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892401 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892404 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892407 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892409 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892412 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892415 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892417 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892435 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892439 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892441 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892444 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892447 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:12:58.893779 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892450 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892452 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892455 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892457 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892460 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892462 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892465 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892467 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892470 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892479 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892481 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892484 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892487 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892489 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892491 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892494 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892497 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892500 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892502 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892505 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:12:58.894235 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892507 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892510 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892512 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892515 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892517 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892520 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892522 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892525 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892527 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892530 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892532 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892534 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892537 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892539 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892542 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892544 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892547 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892549 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892552 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:12:58.894747 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892555 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892559 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892561 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892572 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892575 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892577 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892579 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892582 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892585 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892590 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892592 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892595 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892598 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892601 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892603 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892606 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892608 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892611 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892613 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:12:58.895208 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892616 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892618 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892620 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892623 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892625 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892628 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892631 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892634 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892636 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892639 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892643 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892646 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892648 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892651 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.892653 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893171 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893180 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893197 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893201 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893205 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893208 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 21:12:58.895688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893213 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893217 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893220 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893223 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893227 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893230 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893233 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893236 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893239 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893242 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893245 2567 flags.go:64] FLAG: --cloud-config=""
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893248 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893250 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893257 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893260 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893263 2567 flags.go:64] FLAG: --config-dir=""
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893266 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893269 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893273 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893276 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893279 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893284 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893287 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893290 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 21:12:58.896200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893293 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893296 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893299 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893303 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893306 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893314 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893318 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893321 2567 flags.go:64] FLAG: --enable-server="true"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893324 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893331 2567 flags.go:64] FLAG: --event-burst="100"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893335 2567 flags.go:64] FLAG: --event-qps="50"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893338 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893341 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893344 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893347 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893350 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893353 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893356 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893359 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893362 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893364 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893368 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893370 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893374 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893376 2567 flags.go:64] FLAG: --feature-gates=""
Apr 20 21:12:58.896900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893380 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893383 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893386 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893389 2567
flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893393 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893396 2567 flags.go:64] FLAG: --help="false" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893399 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-132-45.ec2.internal" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893402 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893405 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893408 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893411 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893415 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893418 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893439 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893443 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893446 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893449 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 21:12:58.897533 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893452 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893455 2567 flags.go:64] FLAG: --kube-reserved="" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893458 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893461 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893464 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893467 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893469 2567 flags.go:64] FLAG: --lock-file="" Apr 20 21:12:58.897533 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893472 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893475 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893478 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893483 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893486 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893489 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893492 2567 flags.go:64] FLAG: --logging-format="text" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893495 2567 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893498 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893500 2567 flags.go:64] FLAG: --manifest-url="" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893503 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893507 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893511 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893515 2567 flags.go:64] FLAG: --max-pods="110" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893518 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893521 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893524 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893527 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893530 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893533 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893536 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893543 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 
20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893551 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893555 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 21:12:58.898148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893558 2567 flags.go:64] FLAG: --pod-cidr="" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893560 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893565 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893568 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893571 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893574 2567 flags.go:64] FLAG: --port="10250" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893577 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893580 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00ea5d23ed7daf0ce" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893583 2567 flags.go:64] FLAG: --qos-reserved="" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893585 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893588 2567 flags.go:64] FLAG: --register-node="true" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893591 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 20 21:12:58.898970 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:12:58.893594 2567 flags.go:64] FLAG: --register-with-taints="" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893597 2567 flags.go:64] FLAG: --registry-burst="10" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893600 2567 flags.go:64] FLAG: --registry-qps="5" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893603 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893605 2567 flags.go:64] FLAG: --reserved-memory="" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893609 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893612 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893615 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893619 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893622 2567 flags.go:64] FLAG: --runonce="false" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893624 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893627 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893630 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 20 21:12:58.898970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893633 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893636 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 21:12:58.899804 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893639 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893642 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893645 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893648 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893656 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893659 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893662 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893665 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893668 2567 flags.go:64] FLAG: --system-cgroups="" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893670 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893676 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893678 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893681 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893688 2567 flags.go:64] FLAG: --tls-min-version="" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893691 2567 flags.go:64] FLAG: 
--tls-private-key-file="" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893693 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893696 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893699 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893702 2567 flags.go:64] FLAG: --v="2" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893706 2567 flags.go:64] FLAG: --version="false" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893713 2567 flags.go:64] FLAG: --vmodule="" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893717 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.893721 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 21:12:58.899804 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893860 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893864 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893869 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893872 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893875 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893877 2567 feature_gate.go:328] unrecognized 
feature gate: AutomatedEtcdBackup Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893880 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893883 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893886 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893889 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893891 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893894 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893896 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893899 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893902 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893905 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893907 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893910 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 21:12:58.900398 ip-10-0-132-45 
kubenswrapper[2567]: W0420 21:12:58.893912 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893915 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 21:12:58.900398 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893917 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893920 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893922 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893925 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893927 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893930 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893932 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893934 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893937 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893939 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893942 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 21:12:58.900931 ip-10-0-132-45 
kubenswrapper[2567]: W0420 21:12:58.893944 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893947 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893949 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893953 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893956 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893958 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893961 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893963 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893966 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 21:12:58.900931 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893969 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893971 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893975 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893977 2567 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893980 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893982 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893985 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893989 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893991 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893993 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.893997 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894000 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894003 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894006 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894008 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894010 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894013 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894015 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894018 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:58.901476 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894020 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894022 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894025 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894027 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894030 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894032 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894035 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894038 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894041 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894043 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894045 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894048 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894053 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894056 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894058 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894061 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894063 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894066 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894068 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894071 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:58.901945 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894073 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894077 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894079 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894082 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894084 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894088 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.894092 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.894096 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.901440 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.901456 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901504 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901508 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901512 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901515 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901518 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:12:58.902453 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901521 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901525 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901528 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901531 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901534 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901537 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901540 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901543 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901545 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901548 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901550 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901553 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901555 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901558 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901561 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901563 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901566 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901568 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901571 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901574 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:12:58.902854 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901577 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901579 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901582 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901585 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901588 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901590 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901593 2567 feature_gate.go:328]
unrecognized feature gate: ExternalOIDC Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901596 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901598 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901601 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901603 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901606 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901609 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901611 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901614 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901616 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901619 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901621 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 21:12:58.903385 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901623 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 21:12:58.903385 
ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901626 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901628 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901632 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901636 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901639 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901642 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901644 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901647 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901650 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901652 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901655 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901658 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901660 2567 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901663 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901666 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901668 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901672 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901674 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901677 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 21:12:58.903870 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901679 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901682 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901684 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901687 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901689 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901692 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 21:12:58.904358 ip-10-0-132-45 
kubenswrapper[2567]: W0420 21:12:58.901694 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901697 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901700 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901703 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901705 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901707 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901710 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901713 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901716 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901718 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901721 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901723 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901726 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 21:12:58.904358 
ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901728 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 21:12:58.904358 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901731 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901733 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901738 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.901745 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901845 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901851 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901854 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901857 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 
21:12:58.901860 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901862 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901865 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901868 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901870 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901873 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901876 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 21:12:58.904911 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901880 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901883 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901886 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901888 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901891 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901894 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901896 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901899 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901901 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901904 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901907 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901910 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901913 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere 
Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901915 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901918 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901921 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901923 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901926 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901928 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901931 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 21:12:58.905297 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901933 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901936 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901939 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901941 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901944 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901947 2567 feature_gate.go:328] 
unrecognized feature gate: DNSNameResolver Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901949 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901952 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901955 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901958 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901960 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901963 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901965 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901968 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901971 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901973 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901976 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901978 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: 
W0420 21:12:58.901980 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901983 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 21:12:58.905901 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901985 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901988 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901990 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901993 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901996 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.901999 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902001 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902004 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902006 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902009 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902011 2567 feature_gate.go:328] unrecognized feature gate: 
MultiArchInstallAzure Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902014 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902016 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902018 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902021 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902023 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902026 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902028 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902031 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 21:12:58.906456 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902034 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902036 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902039 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902041 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 
21:12:58.902044 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902046 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902049 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902051 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902054 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902056 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902060 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902064 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902067 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902070 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902072 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 21:12:58.907037 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:12:58.902075 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 21:12:58.907604 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.902080 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 21:12:58.907604 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.902699 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 21:12:58.907604 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.904655 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 21:12:58.907604 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.905365 2567 server.go:1019] "Starting client certificate rotation" Apr 20 21:12:58.907604 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.905462 2567 certificate_manager.go:422] "Certificate 
rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 21:12:58.907604 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.905517 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 21:12:58.925835 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.925817 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 21:12:58.927832 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.927807 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 21:12:58.944748 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.944585 2567 log.go:25] "Validated CRI v1 runtime API" Apr 20 21:12:58.949500 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.949486 2567 log.go:25] "Validated CRI v1 image API" Apr 20 21:12:58.950636 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.950619 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 21:12:58.954210 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.954190 2567 fs.go:135] Filesystem UUIDs: map[026175f5-d54a-44b4-8e4a-0eb810234ceb:/dev/nvme0n1p4 07d12c34-d027-4587-8f35-8cdf9a8d7e9b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 20 21:12:58.954271 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.954209 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 21:12:58.955810 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.955794 2567 
reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 21:12:58.959493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.959377 2567 manager.go:217] Machine: {Timestamp:2026-04-20 21:12:58.957762562 +0000 UTC m=+0.350948094 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101817 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d1ac164a707d8f70f3aa4b169e487 SystemUUID:ec2d1ac1-64a7-07d8-f70f-3aa4b169e487 BootID:f3bce254-c4db-4018-87e8-173b682b5ca6 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5f:0b:83:23:b1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5f:0b:83:23:b1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:c3:46:71:34:4b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 21:12:58.959493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.959488 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 21:12:58.959600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.959558 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 21:12:58.960432 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.960405 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 21:12:58.960573 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.960436 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-45.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 21:12:58.960621 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.960582 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 21:12:58.960621 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.960591 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 21:12:58.960621 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.960607 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:12:58.961327 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.961316 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:12:58.961999 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.961988 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 21:12:58.962102 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.962093 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 21:12:58.964065 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.964056 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 21:12:58.964105 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.964069 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 21:12:58.964105 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.964080 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 21:12:58.964105 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.964091 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 20 21:12:58.964105 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.964098 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 21:12:58.965089 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.965077 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:12:58.965131 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.965097 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:12:58.967442 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.967414 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 21:12:58.968635 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.968622 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 21:12:58.970236 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970215 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970246 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970280 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970295 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970303 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970312 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970320 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970348 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 21:12:58.970355 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970357 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 21:12:58.970624 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970366 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 21:12:58.970624 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970388 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 21:12:58.970624 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.970401 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 21:12:58.971953 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.971940 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 21:12:58.972023 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.971956 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 21:12:58.975409 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.975393 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 21:12:58.975515 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.975482 2567 server.go:1295] "Started kubelet"
Apr 20 21:12:58.975651 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.975585 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 21:12:58.976177 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.976115 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 21:12:58.976256 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.976198 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 21:12:58.976320 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.976304 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-45.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 21:12:58.976369 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:58.976193 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 21:12:58.976360 ip-10-0-132-45 systemd[1]: Started Kubernetes Kubelet.
Apr 20 21:12:58.976493 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:58.976411 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 21:12:58.976889 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.976798 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7mfjf"
Apr 20 21:12:58.977367 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.977352 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 21:12:58.978483 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.978469 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 21:12:58.983347 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.983328 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 21:12:58.983821 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.983796 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 21:12:58.984297 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:58.983181 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-45.ec2.internal.18a82d0b2a4fb190 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-45.ec2.internal,UID:ip-10-0-132-45.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-45.ec2.internal,},FirstTimestamp:2026-04-20 21:12:58.97541672 +0000 UTC m=+0.368602259,LastTimestamp:2026-04-20 21:12:58.97541672 +0000 UTC m=+0.368602259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-45.ec2.internal,}"
Apr 20 21:12:58.984442 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.984407 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 21:12:58.984511 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.984447 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 21:12:58.984914 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.984656 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7mfjf"
Apr 20 21:12:58.984914 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.984661 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 21:12:58.984914 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.984761 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 21:12:58.984914 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.984904 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:58.984959 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985016 2567 factory.go:153] Registering CRI-O factory
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985034 2567 factory.go:223] Registration of the crio container factory successfully
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985080 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985088 2567 factory.go:55] Registering systemd factory
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985108 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985128 2567 factory.go:103] Registering Raw factory
Apr 20 21:12:58.985155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985141 2567 manager.go:1196] Started watching for new ooms in manager
Apr 20 21:12:58.985826 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:58.985805 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 21:12:58.985970 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.985956 2567 manager.go:319] Starting recovery of all containers
Apr 20 21:12:58.987165 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:58.987137 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 21:12:58.987276 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:58.987252 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 21:12:58.996307 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:58.996289 2567 manager.go:324] Recovery completed
Apr 20 21:12:59.000552 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.000540 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:12:59.002949 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.002933 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:12:59.003014 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.002960 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:12:59.003014 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.002970 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:12:59.003388 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.003375 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 21:12:59.003388 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.003386 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 21:12:59.003495 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.003400 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 21:12:59.005398 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.005385 2567 policy_none.go:49] "None policy: Start"
Apr 20 21:12:59.005469 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.005400 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 21:12:59.005469 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.005409 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 21:12:59.030962 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.030946 2567 manager.go:341] "Starting Device Plugin manager"
Apr 20 21:12:59.031034 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.030975 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 21:12:59.031034 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.030983 2567 server.go:85] "Starting device plugin registration server"
Apr 20 21:12:59.031204 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.031189 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 21:12:59.031310 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.031205 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 21:12:59.031310 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.031281 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 21:12:59.031463 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.031378 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 21:12:59.031463 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.031388 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 21:12:59.031821 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.031798 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 21:12:59.031894 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.031840 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.120093 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.120061 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 21:12:59.121242 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.121227 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 21:12:59.121302 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.121250 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 21:12:59.121302 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.121265 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 21:12:59.121302 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.121272 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 21:12:59.121453 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.121303 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 21:12:59.124214 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.124190 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:12:59.131918 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.131904 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:12:59.132621 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.132604 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:12:59.132677 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.132632 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:12:59.132677 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.132642 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:12:59.132677 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.132662 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.140136 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.140099 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.140136 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.140117 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-45.ec2.internal\": node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.161269 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.161245 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.221888 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.221861 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal"]
Apr 20 21:12:59.221968 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.221933 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:12:59.223165 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.223150 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:12:59.223248 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.223182 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:12:59.223248 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.223197 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:12:59.224218 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224203 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:12:59.224358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224345 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.224402 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224372 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:12:59.224877 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224849 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:12:59.224877 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224860 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:12:59.224877 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224877 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:12:59.225028 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224889 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:12:59.225028 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224880 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:12:59.225028 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.224917 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:12:59.225849 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.225836 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.225905 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.225859 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:12:59.226448 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.226418 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:12:59.226529 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.226463 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:12:59.226529 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.226476 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:12:59.241926 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.241905 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-45.ec2.internal\" not found" node="ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.246170 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.246156 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-45.ec2.internal\" not found" node="ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.261654 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.261637 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.286474 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.286455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41eb6092f378cd0fab4096d88a265750-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal\" (UID: \"41eb6092f378cd0fab4096d88a265750\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.286531 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.286477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8209567a38135e157ae32e066a471fe4-config\") pod \"kube-apiserver-proxy-ip-10-0-132-45.ec2.internal\" (UID: \"8209567a38135e157ae32e066a471fe4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.286531 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.286495 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/41eb6092f378cd0fab4096d88a265750-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal\" (UID: \"41eb6092f378cd0fab4096d88a265750\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.362381 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.362352 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.386667 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.386643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/41eb6092f378cd0fab4096d88a265750-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal\" (UID: \"41eb6092f378cd0fab4096d88a265750\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.386758 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.386677 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41eb6092f378cd0fab4096d88a265750-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal\" (UID: \"41eb6092f378cd0fab4096d88a265750\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.386758 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.386703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8209567a38135e157ae32e066a471fe4-config\") pod \"kube-apiserver-proxy-ip-10-0-132-45.ec2.internal\" (UID: \"8209567a38135e157ae32e066a471fe4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.386847 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.386760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8209567a38135e157ae32e066a471fe4-config\") pod \"kube-apiserver-proxy-ip-10-0-132-45.ec2.internal\" (UID: \"8209567a38135e157ae32e066a471fe4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.386847 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.386764 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/41eb6092f378cd0fab4096d88a265750-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal\" (UID: \"41eb6092f378cd0fab4096d88a265750\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.386847 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.386766 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41eb6092f378cd0fab4096d88a265750-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal\" (UID: \"41eb6092f378cd0fab4096d88a265750\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.462924 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.462880 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.543559 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.543538 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.549004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.548981 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal"
Apr 20 21:12:59.563671 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.563639 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.664214 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.664180 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.764758 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.764707 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.865370 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.865346 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.905816 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.905795 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 21:12:59.906209 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.905940 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 21:12:59.966206 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:12:59.966186 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:12:59.984297 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.984278 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 21:12:59.987204 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.987176 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 21:07:58 +0000 UTC" deadline="2028-01-27 02:31:11.800112626 +0000 UTC"
Apr 20 21:12:59.987204 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.987198 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15509h18m11.812917811s"
Apr 20 21:12:59.997500 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:12:59.997481 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 21:13:00.023018 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.022977 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-424nb"
Apr 20 21:13:00.031996 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.031980 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-424nb"
Apr 20 21:13:00.062090 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:00.062065 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8209567a38135e157ae32e066a471fe4.slice/crio-1926f5cad7ba293ed133b2ed75d25036908e115d0f1ecfdd43a66a28e99a8682 WatchSource:0}: Error finding container 1926f5cad7ba293ed133b2ed75d25036908e115d0f1ecfdd43a66a28e99a8682: Status 404 returned error can't find the container with id 1926f5cad7ba293ed133b2ed75d25036908e115d0f1ecfdd43a66a28e99a8682
Apr 20 21:13:00.062553 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:00.062530 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41eb6092f378cd0fab4096d88a265750.slice/crio-24f8f19125ae50d89ebc4f87097ed0bdaefda136b30b51b590718bf8c25b126f WatchSource:0}: Error finding container 24f8f19125ae50d89ebc4f87097ed0bdaefda136b30b51b590718bf8c25b126f: Status 404 returned error can't find the container with id 24f8f19125ae50d89ebc4f87097ed0bdaefda136b30b51b590718bf8c25b126f
Apr 20 21:13:00.066025 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.066011 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:13:00.066241 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:00.066227 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found"
Apr 20 21:13:00.123451 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.123395 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal" event={"ID":"41eb6092f378cd0fab4096d88a265750","Type":"ContainerStarted","Data":"24f8f19125ae50d89ebc4f87097ed0bdaefda136b30b51b590718bf8c25b126f"}
Apr 20 21:13:00.124268 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.124250 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal"
event={"ID":"8209567a38135e157ae32e066a471fe4","Type":"ContainerStarted","Data":"1926f5cad7ba293ed133b2ed75d25036908e115d0f1ecfdd43a66a28e99a8682"} Apr 20 21:13:00.166413 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:00.166394 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found" Apr 20 21:13:00.255530 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.255507 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:13:00.266694 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:00.266674 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found" Apr 20 21:13:00.367225 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:00.367202 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found" Apr 20 21:13:00.373684 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.373668 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:13:00.468299 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:00.468279 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-45.ec2.internal\" not found" Apr 20 21:13:00.565539 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.565520 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:13:00.585095 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.584957 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal" Apr 20 21:13:00.598176 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.598150 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in 
surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 21:13:00.599050 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.599020 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal" Apr 20 21:13:00.607333 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.607312 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 21:13:00.965279 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.965253 2567 apiserver.go:52] "Watching apiserver" Apr 20 21:13:00.973914 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.973817 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 21:13:00.976292 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.976267 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-2m2ck","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm","openshift-dns/node-resolver-c5gzg","openshift-image-registry/node-ca-jk6wx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal","openshift-multus/multus-additional-cni-plugins-78h5h","openshift-multus/multus-fr6d2","openshift-multus/network-metrics-daemon-6p5ds","kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal","openshift-cluster-node-tuning-operator/tuned-8xjrl","openshift-network-diagnostics/network-check-target-7pqmr","openshift-network-operator/iptables-alerter-kf258","openshift-ovn-kubernetes/ovnkube-node-5qmcs"] Apr 20 21:13:00.980758 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.980733 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.982682 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.982658 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.982790 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.982737 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:00.983720 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.983616 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 21:13:00.983720 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.983632 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jfbzw\"" Apr 20 21:13:00.983720 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.983701 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 21:13:00.983720 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.983707 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 21:13:00.983989 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.983815 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.984889 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.984982 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.984998 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.985027 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.984984 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-72zks\"" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.985147 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s7f2k\"" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.985221 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 21:13:00.985507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.985254 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 21:13:00.986972 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.986953 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.987307 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.987287 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 21:13:00.987670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.987573 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7wb2n\"" Apr 20 21:13:00.987670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.987593 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 21:13:00.987670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.987628 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 21:13:00.989144 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.989125 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 21:13:00.989241 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.989158 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 21:13:00.989241 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.989171 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wnhr7\"" Apr 20 21:13:00.991062 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.991042 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:00.991171 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.991148 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:00.991233 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:00.991209 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:00.993178 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.993150 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 21:13:00.993274 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.993200 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:00.993627 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.993610 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 21:13:00.993793 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.993772 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s9cl2\"" Apr 20 21:13:00.995748 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.995728 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 21:13:00.995840 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.995771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k9nhc\"" Apr 20 21:13:00.995840 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.995780 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:13:00.996514 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-cni-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.996605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996513 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:00.996605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-cnibin\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.996605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996545 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-conf-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.996605 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:00.996568 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:00.996605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-etc-kubernetes\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hx2j\" (UniqueName: \"kubernetes.io/projected/2a4eea4f-8eef-446c-a518-bcb5b140ee35-kube-api-access-7hx2j\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-multus-certs\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996667 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-device-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996693 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/e05ebd33-1ace-4f1d-8379-0925d5e79b13-tmp-dir\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996732 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a4eea4f-8eef-446c-a518-bcb5b140ee35-host\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996759 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-k8s-cni-cncf-io\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-netns\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.996861 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996820 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplpn\" (UniqueName: \"kubernetes.io/projected/27234e93-fd7f-443e-8c3a-28ab70606c45-kube-api-access-wplpn\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996866 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-etc-selinux\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-os-release\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996958 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cni-binary-copy\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.996982 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmfxx\" (UniqueName: \"kubernetes.io/projected/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-kube-api-access-gmfxx\") pod \"multus-additional-cni-plugins-78h5h\" (UID: 
\"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997034 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27234e93-fd7f-443e-8c3a-28ab70606c45-cni-binary-copy\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997059 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-socket-dir-parent\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997082 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-kubelet\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997103 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-hostroot\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997126 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmkm9\" (UniqueName: 
\"kubernetes.io/projected/d4d4427b-ceca-4928-90cc-5ebacc067735-kube-api-access-wmkm9\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997149 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cnibin\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997174 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997170 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-os-release\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997192 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e05ebd33-1ace-4f1d-8379-0925d5e79b13-hosts-file\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 
21:13:00.997240 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997261 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-socket-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997337 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-sys-fs\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hz7r\" (UniqueName: \"kubernetes.io/projected/e05ebd33-1ace-4f1d-8379-0925d5e79b13-kube-api-access-5hz7r\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997400 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2a4eea4f-8eef-446c-a518-bcb5b140ee35-serviceca\") pod \"node-ca-jk6wx\" (UID: 
\"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997439 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997467 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-daemon-config\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997508 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-registration-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997542 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-system-cni-dir\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997578 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-system-cni-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997598 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-cni-bin\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.997691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.997623 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-cni-multus\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:00.998966 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:00.998945 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-kf258" Apr 20 21:13:01.001554 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.001531 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 21:13:01.001554 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.001545 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 21:13:01.001695 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.001594 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:13:01.001804 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.001781 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.001908 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.001875 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z6z28\"" Apr 20 21:13:01.004237 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.004167 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 21:13:01.004700 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.004507 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 21:13:01.004700 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.004606 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 21:13:01.004848 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.004741 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 21:13:01.004903 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.004878 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 21:13:01.004998 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.004970 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 21:13:01.005186 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.005170 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-p7v8b\"" Apr 20 21:13:01.032637 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.032610 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:08:00 +0000 UTC" deadline="2027-09-26 02:08:10.447993755 +0000 UTC" Apr 20 21:13:01.032637 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.032636 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12556h55m9.415361004s" Apr 20 21:13:01.085628 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.085611 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 21:13:01.098193 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a4eea4f-8eef-446c-a518-bcb5b140ee35-host\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:01.098282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098201 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-sys\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.098282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098221 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-lib-modules\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.098282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098246 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b73ab204-677f-45ce-8d9d-26e042fd308c-ovn-node-metrics-cert\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.098282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-netns\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.098282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098278 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a4eea4f-8eef-446c-a518-bcb5b140ee35-host\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplpn\" (UniqueName: 
\"kubernetes.io/projected/27234e93-fd7f-443e-8c3a-28ab70606c45-kube-api-access-wplpn\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098349 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-netns\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098388 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-etc-selinux\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098451 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmfxx\" (UniqueName: \"kubernetes.io/projected/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-kube-api-access-gmfxx\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098491 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/790f57a1-725b-4f47-abb1-5623730655e9-konnectivity-ca\") pod \"konnectivity-agent-2m2ck\" (UID: \"790f57a1-725b-4f47-abb1-5623730655e9\") " pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098510 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-etc-selinux\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:01.098556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098546 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-log-socket\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx6t4\" (UniqueName: \"kubernetes.io/projected/b73ab204-677f-45ce-8d9d-26e042fd308c-kube-api-access-jx6t4\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27234e93-fd7f-443e-8c3a-28ab70606c45-cni-binary-copy\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.098922 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:13:01.098623 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-hostroot\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098647 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmkm9\" (UniqueName: \"kubernetes.io/projected/d4d4427b-ceca-4928-90cc-5ebacc067735-kube-api-access-wmkm9\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098670 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-hostroot\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3b66e697-8367-48ee-b139-bb6c98743b29-iptables-alerter-script\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098716 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-kubelet\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-cni-netd\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098755 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-ovnkube-config\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098810 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-modprobe-d\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098835 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.098922 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098884 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-ovnkube-script-lib\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098929 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-socket-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.098966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-sys-fs\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099027 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-sys-fs\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099041 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hz7r\" (UniqueName: \"kubernetes.io/projected/e05ebd33-1ace-4f1d-8379-0925d5e79b13-kube-api-access-5hz7r\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099071 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099098 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-tuned\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099106 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-socket-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099122 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b66e697-8367-48ee-b139-bb6c98743b29-host-slash\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258" 
Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099164 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-node-log\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099187 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-cni-bin\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099193 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27234e93-fd7f-443e-8c3a-28ab70606c45-cni-binary-copy\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099214 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-system-cni-dir\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099234 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " 
pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099255 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl6ls\" (UniqueName: \"kubernetes.io/projected/3b66e697-8367-48ee-b139-bb6c98743b29-kube-api-access-sl6ls\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099285 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-run-netns\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099291 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-system-cni-dir\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.099586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-system-cni-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099364 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-cni-multus\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099380 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-system-cni-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099407 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-cni-multus\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099396 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-systemd\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/790f57a1-725b-4f47-abb1-5623730655e9-agent-certs\") pod \"konnectivity-agent-2m2ck\" (UID: \"790f57a1-725b-4f47-abb1-5623730655e9\") " pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099486 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-slash\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-cni-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099535 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-cnibin\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-etc-kubernetes\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysctl-conf\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-ovn\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-cni-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-multus-certs\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099674 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-device-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-etc-kubernetes\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e05ebd33-1ace-4f1d-8379-0925d5e79b13-tmp-dir\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:01.100358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099721 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-run\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099748 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-cnibin\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099765 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-k8s-cni-cncf-io\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099783 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-multus-certs\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-os-release\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cni-binary-copy\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-device-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099867 
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-run-k8s-cni-cncf-io\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099902 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-os-release\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.099985 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-kubernetes\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100012 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysctl-d\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100033 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-host\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101154 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxf98\" (UniqueName: \"kubernetes.io/projected/ea587890-27b1-40ff-bdfc-67d94b889d89-kube-api-access-gxf98\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100102 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e05ebd33-1ace-4f1d-8379-0925d5e79b13-tmp-dir\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-socket-dir-parent\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-kubelet\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100260 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-socket-dir-parent\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101939 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100263 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cnibin\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-kubelet\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100300 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-os-release\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100324 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e05ebd33-1ace-4f1d-8379-0925d5e79b13-hosts-file\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100331 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cnibin\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: 
I0420 21:13:01.100350 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-var-lib-kubelet\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100367 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-cni-binary-copy\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100375 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e05ebd33-1ace-4f1d-8379-0925d5e79b13-hosts-file\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100376 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea587890-27b1-40ff-bdfc-67d94b889d89-tmp\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 
21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100409 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-os-release\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-env-overrides\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100466 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2a4eea4f-8eef-446c-a518-bcb5b140ee35-serviceca\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " 
pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100598 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysconfig\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.101939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100618 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-systemd-units\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-daemon-config\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100662 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-registration-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100641 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4x2p\" (UniqueName: \"kubernetes.io/projected/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-kube-api-access-t4x2p\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-cni-bin\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100745 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4d4427b-ceca-4928-90cc-5ebacc067735-registration-dir\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100786 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-etc-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100795 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-host-var-lib-cni-bin\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100836 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-conf-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hx2j\" (UniqueName: \"kubernetes.io/projected/2a4eea4f-8eef-446c-a518-bcb5b140ee35-kube-api-access-7hx2j\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-conf-dir\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100896 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100923 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-systemd\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100948 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-var-lib-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.100975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-run-ovn-kubernetes\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.101021 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2a4eea4f-8eef-446c-a518-bcb5b140ee35-serviceca\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:01.102730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.101128 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27234e93-fd7f-443e-8c3a-28ab70606c45-multus-daemon-config\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.104332 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.104303 2567 swap_util.go:74] "error creating dir to 
test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 21:13:01.108002 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.107979 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hz7r\" (UniqueName: \"kubernetes.io/projected/e05ebd33-1ace-4f1d-8379-0925d5e79b13-kube-api-access-5hz7r\") pod \"node-resolver-c5gzg\" (UID: \"e05ebd33-1ace-4f1d-8379-0925d5e79b13\") " pod="openshift-dns/node-resolver-c5gzg" Apr 20 21:13:01.108002 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.107996 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplpn\" (UniqueName: \"kubernetes.io/projected/27234e93-fd7f-443e-8c3a-28ab70606c45-kube-api-access-wplpn\") pod \"multus-fr6d2\" (UID: \"27234e93-fd7f-443e-8c3a-28ab70606c45\") " pod="openshift-multus/multus-fr6d2" Apr 20 21:13:01.108146 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.108001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmfxx\" (UniqueName: \"kubernetes.io/projected/b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0-kube-api-access-gmfxx\") pod \"multus-additional-cni-plugins-78h5h\" (UID: \"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0\") " pod="openshift-multus/multus-additional-cni-plugins-78h5h" Apr 20 21:13:01.108146 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.107986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmkm9\" (UniqueName: \"kubernetes.io/projected/d4d4427b-ceca-4928-90cc-5ebacc067735-kube-api-access-wmkm9\") pod \"aws-ebs-csi-driver-node-ftpsm\" (UID: \"d4d4427b-ceca-4928-90cc-5ebacc067735\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" Apr 20 21:13:01.108493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.108439 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hx2j\" 
(UniqueName: \"kubernetes.io/projected/2a4eea4f-8eef-446c-a518-bcb5b140ee35-kube-api-access-7hx2j\") pod \"node-ca-jk6wx\" (UID: \"2a4eea4f-8eef-446c-a518-bcb5b140ee35\") " pod="openshift-image-registry/node-ca-jk6wx" Apr 20 21:13:01.201696 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx6t4\" (UniqueName: \"kubernetes.io/projected/b73ab204-677f-45ce-8d9d-26e042fd308c-kube-api-access-jx6t4\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.201696 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3b66e697-8367-48ee-b139-bb6c98743b29-iptables-alerter-script\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258" Apr 20 21:13:01.201897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201720 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-kubelet\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.201897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-cni-netd\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.201897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201759 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-ovnkube-config\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.201897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-kubelet\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.201897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201824 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-cni-netd\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.201897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201852 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-modprobe-d\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.201897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 
21:13:01.201918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-ovnkube-script-lib\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201943 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-modprobe-d\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.201950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-tuned\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b66e697-8367-48ee-b139-bb6c98743b29-host-slash\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258" Apr 20 
21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202027 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-node-log\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-cni-bin\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202076 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl6ls\" (UniqueName: \"kubernetes.io/projected/3b66e697-8367-48ee-b139-bb6c98743b29-kube-api-access-sl6ls\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b66e697-8367-48ee-b139-bb6c98743b29-host-slash\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202101 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-node-log\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 
21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-run-netns\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202129 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-cni-bin\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-run-netns\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-systemd\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.202192 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/790f57a1-725b-4f47-abb1-5623730655e9-agent-certs\") pod \"konnectivity-agent-2m2ck\" (UID: \"790f57a1-725b-4f47-abb1-5623730655e9\") " pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:01.202192 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202201 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-slash\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysctl-conf\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202250 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-ovn\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-run\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202306 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-kubernetes\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" Apr 20 21:13:01.202933 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:13:01.202328 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysctl-d\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-host\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxf98\" (UniqueName: \"kubernetes.io/projected/ea587890-27b1-40ff-bdfc-67d94b889d89-kube-api-access-gxf98\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-systemd\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202480 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-var-lib-kubelet\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202494 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-run\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202519 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-ovn\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-kubernetes\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-ovnkube-config\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3b66e697-8367-48ee-b139-bb6c98743b29-iptables-alerter-script\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-ovnkube-script-lib\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202671 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-host\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202671 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysctl-d\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.202933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202409 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-var-lib-kubelet\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202728 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea587890-27b1-40ff-bdfc-67d94b889d89-tmp\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-slash\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202785 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysctl-conf\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202794 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-env-overrides\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysconfig\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202853 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-systemd-units\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202858 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4x2p\" (UniqueName: \"kubernetes.io/projected/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-kube-api-access-t4x2p\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-etc-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-systemd-units\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-systemd\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-etc-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-var-lib-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-run-ovn-kubernetes\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.203786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-sys\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203070 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-lib-modules\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b73ab204-677f-45ce-8d9d-26e042fd308c-ovn-node-metrics-cert\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.203117 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/790f57a1-725b-4f47-abb1-5623730655e9-konnectivity-ca\") pod \"konnectivity-agent-2m2ck\" (UID: \"790f57a1-725b-4f47-abb1-5623730655e9\") " pod="kube-system/konnectivity-agent-2m2ck"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-host-run-ovn-kubernetes\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203166 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203220 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-log-socket\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203237 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-sys\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.203299 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:01.703255438 +0000 UTC m=+3.096440978 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.202914 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-sysconfig\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203117 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b73ab204-677f-45ce-8d9d-26e042fd308c-env-overrides\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203363 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea587890-27b1-40ff-bdfc-67d94b889d89-lib-modules\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203409 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-log-socket\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-var-lib-openvswitch\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.203220 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b73ab204-677f-45ce-8d9d-26e042fd308c-run-systemd\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.204475 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/790f57a1-725b-4f47-abb1-5623730655e9-konnectivity-ca\") pod \"konnectivity-agent-2m2ck\" (UID: \"790f57a1-725b-4f47-abb1-5623730655e9\") " pod="kube-system/konnectivity-agent-2m2ck"
Apr 20 21:13:01.204615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.204599 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea587890-27b1-40ff-bdfc-67d94b889d89-etc-tuned\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.205331 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.205096 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/790f57a1-725b-4f47-abb1-5623730655e9-agent-certs\") pod \"konnectivity-agent-2m2ck\" (UID: \"790f57a1-725b-4f47-abb1-5623730655e9\") " pod="kube-system/konnectivity-agent-2m2ck"
Apr 20 21:13:01.205383 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.205337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b73ab204-677f-45ce-8d9d-26e042fd308c-ovn-node-metrics-cert\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.205486 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.205466 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea587890-27b1-40ff-bdfc-67d94b889d89-tmp\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.209661 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.209388 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:13:01.209661 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.209412 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:13:01.209661 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.209442 2567 projected.go:194] Error preparing data for projected volume kube-api-access-kdck5 for pod openshift-network-diagnostics/network-check-target-7pqmr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:13:01.209661 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.209509 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5 podName:35b8e8ff-14c6-4807-bd77-b37eaea1544c nodeName:}" failed. No retries permitted until 2026-04-20 21:13:01.709481406 +0000 UTC m=+3.102666941 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-kdck5" (UniqueName: "kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5") pod "network-check-target-7pqmr" (UID: "35b8e8ff-14c6-4807-bd77-b37eaea1544c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:13:01.210104 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.210083 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl6ls\" (UniqueName: \"kubernetes.io/projected/3b66e697-8367-48ee-b139-bb6c98743b29-kube-api-access-sl6ls\") pod \"iptables-alerter-kf258\" (UID: \"3b66e697-8367-48ee-b139-bb6c98743b29\") " pod="openshift-network-operator/iptables-alerter-kf258"
Apr 20 21:13:01.211018 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.210994 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxf98\" (UniqueName: \"kubernetes.io/projected/ea587890-27b1-40ff-bdfc-67d94b889d89-kube-api-access-gxf98\") pod \"tuned-8xjrl\" (UID: \"ea587890-27b1-40ff-bdfc-67d94b889d89\") " pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.211124 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.211063 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:13:01.211389 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.211356 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx6t4\" (UniqueName: \"kubernetes.io/projected/b73ab204-677f-45ce-8d9d-26e042fd308c-kube-api-access-jx6t4\") pod \"ovnkube-node-5qmcs\" (UID: \"b73ab204-677f-45ce-8d9d-26e042fd308c\") " pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.211839 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.211822 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4x2p\" (UniqueName: \"kubernetes.io/projected/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-kube-api-access-t4x2p\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds"
Apr 20 21:13:01.292151 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.292121 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fr6d2"
Apr 20 21:13:01.298787 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.298763 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm"
Apr 20 21:13:01.307434 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.307406 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c5gzg"
Apr 20 21:13:01.312962 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.312947 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jk6wx"
Apr 20 21:13:01.319447 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.319416 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-78h5h"
Apr 20 21:13:01.325969 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.325954 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2m2ck"
Apr 20 21:13:01.332514 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.332487 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8xjrl"
Apr 20 21:13:01.340002 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.339987 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kf258"
Apr 20 21:13:01.344561 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.344544 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:13:01.706832 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.706795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds"
Apr 20 21:13:01.707074 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.707051 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:13:01.707169 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.707135 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:02.707111184 +0000 UTC m=+4.100296716 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:13:01.718549 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.718529 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a4eea4f_8eef_446c_a518_bcb5b140ee35.slice/crio-a0a3856bff0e05c41d1c014a260372a07dd8d1dee08406ac0a710600049c216f WatchSource:0}: Error finding container a0a3856bff0e05c41d1c014a260372a07dd8d1dee08406ac0a710600049c216f: Status 404 returned error can't find the container with id a0a3856bff0e05c41d1c014a260372a07dd8d1dee08406ac0a710600049c216f
Apr 20 21:13:01.719380 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.719358 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea587890_27b1_40ff_bdfc_67d94b889d89.slice/crio-572dbc545dcac0a4535c17773c810153783170412f05a75224e1f0937b5be237 WatchSource:0}: Error finding container 572dbc545dcac0a4535c17773c810153783170412f05a75224e1f0937b5be237: Status 404 returned error can't find the container with id 572dbc545dcac0a4535c17773c810153783170412f05a75224e1f0937b5be237
Apr 20 21:13:01.720793 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.720767 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d4427b_ceca_4928_90cc_5ebacc067735.slice/crio-1f45f7012f4dda102a37842e06580964968329ec380470077055917346f4fe86 WatchSource:0}: Error finding container 1f45f7012f4dda102a37842e06580964968329ec380470077055917346f4fe86: Status 404 returned error can't find the container with id 1f45f7012f4dda102a37842e06580964968329ec380470077055917346f4fe86
Apr 20 21:13:01.721733 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.721682 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb73ab204_677f_45ce_8d9d_26e042fd308c.slice/crio-bc999cc6ce01af6c68ad30ad3a6934d3c65db5dc48564ffaca655e3ed31afa5a WatchSource:0}: Error finding container bc999cc6ce01af6c68ad30ad3a6934d3c65db5dc48564ffaca655e3ed31afa5a: Status 404 returned error can't find the container with id bc999cc6ce01af6c68ad30ad3a6934d3c65db5dc48564ffaca655e3ed31afa5a
Apr 20 21:13:01.724571 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.724547 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27234e93_fd7f_443e_8c3a_28ab70606c45.slice/crio-388486e9924bc431266d95348f5abaad16452c28036f3ddb23945ba268b40baa WatchSource:0}: Error finding container 388486e9924bc431266d95348f5abaad16452c28036f3ddb23945ba268b40baa: Status 404 returned error can't find the container with id 388486e9924bc431266d95348f5abaad16452c28036f3ddb23945ba268b40baa
Apr 20 21:13:01.725763 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.725742 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8e3dae1_8a79_4f24_a9f1_2cf6da0045a0.slice/crio-ad1d426ca992028011060abada4746d81462a6ea71c9b037c6b1e95cb5bdd470 WatchSource:0}: Error finding container ad1d426ca992028011060abada4746d81462a6ea71c9b037c6b1e95cb5bdd470: Status 404 returned error can't find the container with id ad1d426ca992028011060abada4746d81462a6ea71c9b037c6b1e95cb5bdd470
Apr 20 21:13:01.726680 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.726658 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b66e697_8367_48ee_b139_bb6c98743b29.slice/crio-6dab2bafe516626f450d4a2fdcb2cf24a1f67101bfcbc476502e10c689f5f73f WatchSource:0}: Error finding container 6dab2bafe516626f450d4a2fdcb2cf24a1f67101bfcbc476502e10c689f5f73f: Status 404 returned error can't find the container with id 6dab2bafe516626f450d4a2fdcb2cf24a1f67101bfcbc476502e10c689f5f73f
Apr 20 21:13:01.730849 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.728221 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode05ebd33_1ace_4f1d_8379_0925d5e79b13.slice/crio-5b5c0a7868eea44b6a484ed46a6761a06a6488f5d623bab2459ff884a0ffc4dc WatchSource:0}: Error finding container 5b5c0a7868eea44b6a484ed46a6761a06a6488f5d623bab2459ff884a0ffc4dc: Status 404 returned error can't find the container with id 5b5c0a7868eea44b6a484ed46a6761a06a6488f5d623bab2459ff884a0ffc4dc
Apr 20 21:13:01.730849 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:01.729189 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790f57a1_725b_4f47_abb1_5623730655e9.slice/crio-cd72810790443eb67ae2287c00605ee49035f339116d97a377e18e930e2539b4 WatchSource:0}: Error finding container cd72810790443eb67ae2287c00605ee49035f339116d97a377e18e930e2539b4: Status 404 returned error can't find the container with id cd72810790443eb67ae2287c00605ee49035f339116d97a377e18e930e2539b4
Apr 20 21:13:01.808228 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:01.808117 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr"
Apr 20 21:13:01.808300 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.808260 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:13:01.808300 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.808281 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:13:01.808300 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.808290 2567 projected.go:194] Error preparing data for projected volume kube-api-access-kdck5 for pod openshift-network-diagnostics/network-check-target-7pqmr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:13:01.808450 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:01.808339 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5 podName:35b8e8ff-14c6-4807-bd77-b37eaea1544c nodeName:}" failed. No retries permitted until 2026-04-20 21:13:02.808320973 +0000 UTC m=+4.201506491 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access-kdck5" (UniqueName: "kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5") pod "network-check-target-7pqmr" (UID: "35b8e8ff-14c6-4807-bd77-b37eaea1544c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:13:02.033266 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.033235 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:08:00 +0000 UTC" deadline="2027-11-01 10:22:50.765620327 +0000 UTC"
Apr 20 21:13:02.033266 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.033261 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13429h9m48.73236172s"
Apr 20 21:13:02.131783 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.131747 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal" event={"ID":"8209567a38135e157ae32e066a471fe4","Type":"ContainerStarted","Data":"2200a7d16a6e700ae5fb6a4c02476306f21e9f57059c88eabbc355d7fcc32277"}
Apr 20 21:13:02.134710 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.134661 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kf258" event={"ID":"3b66e697-8367-48ee-b139-bb6c98743b29","Type":"ContainerStarted","Data":"6dab2bafe516626f450d4a2fdcb2cf24a1f67101bfcbc476502e10c689f5f73f"}
Apr 20 21:13:02.136617 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.136591 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" event={"ID":"d4d4427b-ceca-4928-90cc-5ebacc067735","Type":"ContainerStarted","Data":"1f45f7012f4dda102a37842e06580964968329ec380470077055917346f4fe86"}
Apr 20 21:13:02.139409 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.139364 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" event={"ID":"ea587890-27b1-40ff-bdfc-67d94b889d89","Type":"ContainerStarted","Data":"572dbc545dcac0a4535c17773c810153783170412f05a75224e1f0937b5be237"}
Apr 20 21:13:02.141643 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.141619 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jk6wx" event={"ID":"2a4eea4f-8eef-446c-a518-bcb5b140ee35","Type":"ContainerStarted","Data":"a0a3856bff0e05c41d1c014a260372a07dd8d1dee08406ac0a710600049c216f"}
Apr 20 21:13:02.147334 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.146815 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-45.ec2.internal" podStartSLOduration=2.146800861 podStartE2EDuration="2.146800861s" podCreationTimestamp="2026-04-20 21:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:13:02.146157456 +0000 UTC m=+3.539342998" watchObservedRunningTime="2026-04-20 21:13:02.146800861 +0000 UTC m=+3.539986403"
Apr 20 21:13:02.148942 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.148918 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2m2ck" event={"ID":"790f57a1-725b-4f47-abb1-5623730655e9","Type":"ContainerStarted","Data":"cd72810790443eb67ae2287c00605ee49035f339116d97a377e18e930e2539b4"}
Apr 20 21:13:02.152841 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.152820 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c5gzg" event={"ID":"e05ebd33-1ace-4f1d-8379-0925d5e79b13","Type":"ContainerStarted","Data":"5b5c0a7868eea44b6a484ed46a6761a06a6488f5d623bab2459ff884a0ffc4dc"}
Apr 20 21:13:02.155073 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.155048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerStarted","Data":"ad1d426ca992028011060abada4746d81462a6ea71c9b037c6b1e95cb5bdd470"}
Apr 20 21:13:02.164523 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.164497 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr6d2" event={"ID":"27234e93-fd7f-443e-8c3a-28ab70606c45","Type":"ContainerStarted","Data":"388486e9924bc431266d95348f5abaad16452c28036f3ddb23945ba268b40baa"}
Apr 20 21:13:02.168139 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.168076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"bc999cc6ce01af6c68ad30ad3a6934d3c65db5dc48564ffaca655e3ed31afa5a"}
Apr 20 21:13:02.717171 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.716547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds"
Apr 20 21:13:02.717171 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:02.716706 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:13:02.717171 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:02.716765 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:04.71674735 +0000 UTC m=+6.109932873 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:13:02.818003 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:02.817415 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr"
Apr 20 21:13:02.818003 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:02.817596 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:13:02.818003 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:02.817613 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:13:02.818003 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:02.817625 2567 projected.go:194] Error preparing data for projected volume kube-api-access-kdck5 for pod openshift-network-diagnostics/network-check-target-7pqmr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:13:02.818003 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:02.817676 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5 podName:35b8e8ff-14c6-4807-bd77-b37eaea1544c nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:04.817658776 +0000 UTC m=+6.210844300 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdck5" (UniqueName: "kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5") pod "network-check-target-7pqmr" (UID: "35b8e8ff-14c6-4807-bd77-b37eaea1544c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:03.122645 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:03.122143 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:03.122645 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:03.122277 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:03.122645 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:03.122489 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:03.122645 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:03.122589 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:03.179687 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:03.179657 2567 generic.go:358] "Generic (PLEG): container finished" podID="41eb6092f378cd0fab4096d88a265750" containerID="d75874e81faed05ddabbb7a69675ae21865df5b0b0d0cf875d4c283ce3a4497d" exitCode=0 Apr 20 21:13:03.180436 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:03.180395 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal" event={"ID":"41eb6092f378cd0fab4096d88a265750","Type":"ContainerDied","Data":"d75874e81faed05ddabbb7a69675ae21865df5b0b0d0cf875d4c283ce3a4497d"} Apr 20 21:13:04.187474 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:04.187238 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal" event={"ID":"41eb6092f378cd0fab4096d88a265750","Type":"ContainerStarted","Data":"576a3376d7bddfa234474c5dafd272cbc5b70d913b150de6d99a50fec3c8c3fc"} Apr 20 21:13:04.735011 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:04.734977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:04.735179 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:04.735122 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:04.735237 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:04.735181 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs 
podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:08.735163207 +0000 UTC m=+10.128348725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:04.836432 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:04.835581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:04.836432 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:04.835732 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:04.836432 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:04.835750 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:04.836432 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:04.835761 2567 projected.go:194] Error preparing data for projected volume kube-api-access-kdck5 for pod openshift-network-diagnostics/network-check-target-7pqmr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:04.836432 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:04.835816 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5 podName:35b8e8ff-14c6-4807-bd77-b37eaea1544c nodeName:}" failed. No retries permitted until 2026-04-20 21:13:08.835797872 +0000 UTC m=+10.228983395 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdck5" (UniqueName: "kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5") pod "network-check-target-7pqmr" (UID: "35b8e8ff-14c6-4807-bd77-b37eaea1544c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:05.122392 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:05.122361 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:05.122582 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:05.122498 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:05.122841 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:05.122361 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:05.122963 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:05.122939 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:07.122112 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:07.122079 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:07.122573 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:07.122216 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:07.122707 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:07.122687 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:07.122808 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:07.122787 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:08.763368 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:08.763328 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:08.763902 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:08.763527 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:08.763902 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:08.763591 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:16.763573211 +0000 UTC m=+18.156758735 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:08.864159 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:08.864117 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:08.864333 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:08.864302 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:08.864333 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:08.864323 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:08.864333 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:08.864335 2567 projected.go:194] Error preparing data for projected volume kube-api-access-kdck5 for pod openshift-network-diagnostics/network-check-target-7pqmr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:08.864527 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:08.864397 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5 podName:35b8e8ff-14c6-4807-bd77-b37eaea1544c nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:16.864376512 +0000 UTC m=+18.257562054 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdck5" (UniqueName: "kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5") pod "network-check-target-7pqmr" (UID: "35b8e8ff-14c6-4807-bd77-b37eaea1544c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:09.123085 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:09.122382 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:09.123085 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:09.122516 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:09.123085 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:09.122939 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:09.123085 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:09.123038 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:10.711945 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.711891 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-45.ec2.internal" podStartSLOduration=10.711872436 podStartE2EDuration="10.711872436s" podCreationTimestamp="2026-04-20 21:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:13:04.202801143 +0000 UTC m=+5.595986685" watchObservedRunningTime="2026-04-20 21:13:10.711872436 +0000 UTC m=+12.105057978" Apr 20 21:13:10.712732 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.712709 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rjql2"] Apr 20 21:13:10.718400 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.718372 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.718510 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:10.718480 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:10.781710 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.781678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.781827 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.781736 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-dbus\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.781827 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.781798 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-kubelet-config\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.882726 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.882525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-kubelet-config\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.882726 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.882613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.882726 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.882653 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-dbus\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.882933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.882777 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-kubelet-config\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.882933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:10.882802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-dbus\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:10.882933 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:10.882891 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:10.883078 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:10.882957 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret podName:c487441d-7d18-4b60-b94d-d6e7a1fdc1a0 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:11.382938608 +0000 UTC m=+12.776124127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret") pod "global-pull-secret-syncer-rjql2" (UID: "c487441d-7d18-4b60-b94d-d6e7a1fdc1a0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:11.121882 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.121811 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:11.122015 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.121814 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:11.122015 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:11.121944 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:11.122117 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:11.122034 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:11.201303 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.201266 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2m2ck" event={"ID":"790f57a1-725b-4f47-abb1-5623730655e9","Type":"ContainerStarted","Data":"92911b6c88e026c106e0b82aa8200db251f6d6c6630276be3afa7488bb465889"} Apr 20 21:13:11.202789 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.202757 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c5gzg" event={"ID":"e05ebd33-1ace-4f1d-8379-0925d5e79b13","Type":"ContainerStarted","Data":"bf12c9c76d1ad064310aa0dbef0a6f24f13ca93fe50951c5ba13134f369c0c7c"} Apr 20 21:13:11.204195 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.204167 2567 generic.go:358] "Generic (PLEG): container finished" podID="b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0" containerID="787738b1bc08c24b34d3ea810c12cb202bff346870e0ab26bdea25aae91caab0" exitCode=0 Apr 20 21:13:11.204319 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.204259 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerDied","Data":"787738b1bc08c24b34d3ea810c12cb202bff346870e0ab26bdea25aae91caab0"} Apr 20 21:13:11.205858 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.205832 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" event={"ID":"d4d4427b-ceca-4928-90cc-5ebacc067735","Type":"ContainerStarted","Data":"82dbd7f02bb844dd0d4c19c7ec997d05513512110cd6aebbf94c431be0fbbb65"} Apr 20 21:13:11.207141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.207119 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" 
event={"ID":"ea587890-27b1-40ff-bdfc-67d94b889d89","Type":"ContainerStarted","Data":"09e1b8f506857909837aa2edfa489659d778df197bd3e03672e634a1a2c4bc48"} Apr 20 21:13:11.208454 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.208414 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jk6wx" event={"ID":"2a4eea4f-8eef-446c-a518-bcb5b140ee35","Type":"ContainerStarted","Data":"164f8ecb2041b1f128a70843731b3b4ae5c5ac9860ac62bdd01df29508b4752b"} Apr 20 21:13:11.216976 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.216930 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2m2ck" podStartSLOduration=3.353167664 podStartE2EDuration="12.21691756s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.750727837 +0000 UTC m=+3.143913356" lastFinishedPulling="2026-04-20 21:13:10.614477729 +0000 UTC m=+12.007663252" observedRunningTime="2026-04-20 21:13:11.216279392 +0000 UTC m=+12.609464943" watchObservedRunningTime="2026-04-20 21:13:11.21691756 +0000 UTC m=+12.610103098" Apr 20 21:13:11.230488 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.230452 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c5gzg" podStartSLOduration=3.367183716 podStartE2EDuration="12.230440745s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.75091558 +0000 UTC m=+3.144101104" lastFinishedPulling="2026-04-20 21:13:10.614172611 +0000 UTC m=+12.007358133" observedRunningTime="2026-04-20 21:13:11.230173245 +0000 UTC m=+12.623358799" watchObservedRunningTime="2026-04-20 21:13:11.230440745 +0000 UTC m=+12.623626278" Apr 20 21:13:11.243894 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.243861 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jk6wx" podStartSLOduration=3.386993071 
podStartE2EDuration="12.243852443s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.720193528 +0000 UTC m=+3.113379047" lastFinishedPulling="2026-04-20 21:13:10.577052896 +0000 UTC m=+11.970238419" observedRunningTime="2026-04-20 21:13:11.243741619 +0000 UTC m=+12.636927162" watchObservedRunningTime="2026-04-20 21:13:11.243852443 +0000 UTC m=+12.637037983" Apr 20 21:13:11.304668 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.304485 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8xjrl" podStartSLOduration=3.395830021 podStartE2EDuration="12.304475236s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.722222366 +0000 UTC m=+3.115407899" lastFinishedPulling="2026-04-20 21:13:10.630867578 +0000 UTC m=+12.024053114" observedRunningTime="2026-04-20 21:13:11.303955873 +0000 UTC m=+12.697141417" watchObservedRunningTime="2026-04-20 21:13:11.304475236 +0000 UTC m=+12.697660777" Apr 20 21:13:11.387892 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:11.387825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:11.388008 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:11.387919 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:11.388008 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:11.387975 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret podName:c487441d-7d18-4b60-b94d-d6e7a1fdc1a0 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:12.387958712 +0000 UTC m=+13.781144249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret") pod "global-pull-secret-syncer-rjql2" (UID: "c487441d-7d18-4b60-b94d-d6e7a1fdc1a0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:12.122073 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:12.122043 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:12.122467 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:12.122152 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:12.211562 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:12.211511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kf258" event={"ID":"3b66e697-8367-48ee-b139-bb6c98743b29","Type":"ContainerStarted","Data":"adfed7ab389a52e3fae703abd2c3213eb414e30c42681698aec8050169e62237"} Apr 20 21:13:12.228689 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:12.228640 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kf258" podStartSLOduration=4.366779233 podStartE2EDuration="13.228623627s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.750629844 +0000 UTC m=+3.143815364" lastFinishedPulling="2026-04-20 21:13:10.612474238 +0000 UTC m=+12.005659758" observedRunningTime="2026-04-20 21:13:12.228552808 +0000 UTC 
m=+13.621738350" watchObservedRunningTime="2026-04-20 21:13:12.228623627 +0000 UTC m=+13.621809168" Apr 20 21:13:12.396717 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:12.396641 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:12.396854 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:12.396763 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:12.396854 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:12.396846 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret podName:c487441d-7d18-4b60-b94d-d6e7a1fdc1a0 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:14.396825701 +0000 UTC m=+15.790011245 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret") pod "global-pull-secret-syncer-rjql2" (UID: "c487441d-7d18-4b60-b94d-d6e7a1fdc1a0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:13.126252 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:13.124712 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:13.126252 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:13.124870 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:13.126252 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:13.125384 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:13.126252 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:13.125546 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:14.122249 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:14.122220 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:14.122464 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:14.122338 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:14.412073 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:14.411996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:14.412627 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:14.412122 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:14.412627 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:14.412180 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret podName:c487441d-7d18-4b60-b94d-d6e7a1fdc1a0 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:18.412164911 +0000 UTC m=+19.805350431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret") pod "global-pull-secret-syncer-rjql2" (UID: "c487441d-7d18-4b60-b94d-d6e7a1fdc1a0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:15.068883 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:15.068827 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:15.069379 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:15.069353 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:15.121624 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:15.121597 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:15.121765 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:15.121626 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:15.121765 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:15.121709 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:15.121892 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:15.121864 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:15.215845 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:15.215817 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:15.216222 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:15.216204 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2m2ck" Apr 20 21:13:16.122053 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:16.122020 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:16.122655 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:16.122148 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:16.830694 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:16.830656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:16.830866 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:16.830822 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:16.830935 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:16.830892 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:32.830872448 +0000 UTC m=+34.224057970 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:16.931455 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:16.931412 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:16.931618 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:16.931580 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:16.931618 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:16.931603 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:16.931618 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:16.931617 2567 projected.go:194] Error preparing data for projected volume kube-api-access-kdck5 for pod openshift-network-diagnostics/network-check-target-7pqmr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:16.931799 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:16.931681 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5 podName:35b8e8ff-14c6-4807-bd77-b37eaea1544c nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:32.931660992 +0000 UTC m=+34.324846560 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdck5" (UniqueName: "kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5") pod "network-check-target-7pqmr" (UID: "35b8e8ff-14c6-4807-bd77-b37eaea1544c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:17.122405 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:17.122331 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:17.122837 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:17.122485 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:17.122837 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:17.122546 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:17.122837 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:17.122642 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:18.121687 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:18.121654 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:18.121876 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:18.121779 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:18.444887 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:18.444801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:18.445411 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:18.444933 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:18.445411 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:18.445004 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret podName:c487441d-7d18-4b60-b94d-d6e7a1fdc1a0 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:26.444984667 +0000 UTC m=+27.838170189 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret") pod "global-pull-secret-syncer-rjql2" (UID: "c487441d-7d18-4b60-b94d-d6e7a1fdc1a0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:19.122180 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:19.122146 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:19.122356 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:19.122227 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:19.122356 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:19.122260 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:19.122356 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:19.122303 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:20.121438 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:20.121390 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:20.121792 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:20.121509 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:21.121846 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:21.121681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:21.122252 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:21.121743 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:21.122252 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:21.121935 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:21.122252 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:21.122031 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:21.226867 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:21.226823 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr6d2" event={"ID":"27234e93-fd7f-443e-8c3a-28ab70606c45","Type":"ContainerStarted","Data":"6d73c0b345ec1d736087dbf414eb25cefc6c3bfe6f3cb0c1ef6a64c57ac8e974"} Apr 20 21:13:21.228337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:21.228253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"a5097d27f83e4fe400ab115fde5f0e6b57035300c5274b4f644d67c1806161da"} Apr 20 21:13:21.960413 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:21.960217 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 21:13:22.044599 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.044529 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T21:13:21.960410577Z","UUID":"bef22926-d0af-416a-b370-1cbdac13ba8d","Handler":null,"Name":"","Endpoint":""} Apr 20 21:13:22.045750 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.045735 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 21:13:22.045828 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.045756 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 21:13:22.121624 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.121604 2567 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:22.121806 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:22.121686 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:22.231001 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.230973 2567 generic.go:358] "Generic (PLEG): container finished" podID="b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0" containerID="e4181acefbb52c0e708943c34fa4e06ac599059ff1a30b99a243b8003d57ae5b" exitCode=0 Apr 20 21:13:22.231346 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.231050 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerDied","Data":"e4181acefbb52c0e708943c34fa4e06ac599059ff1a30b99a243b8003d57ae5b"} Apr 20 21:13:22.233870 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.233834 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"3d73a6c260553440dcd113c9a60c0e7245cebddc7d7102cf665377f524c1049f"} Apr 20 21:13:22.233870 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.233866 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"ab48a36fc5a64066bb64ffa17c34aac3394b5af39706570cbe111c15124c2b55"} Apr 20 21:13:22.234022 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.233880 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"164dd31111c759136837aa16882085b485e17c92149dd3f47429f1f560463dc3"} Apr 20 21:13:22.234022 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.233892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"3a9262ada6814372cb101159aa4e494e5aecbb6061e61096a3430394f1424eae"} Apr 20 21:13:22.234022 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.233905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"20e9158a3130965018a77d19e0f75bb8a4fa559e7afd1fad8db5340c1f21f843"} Apr 20 21:13:22.235519 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.235493 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" event={"ID":"d4d4427b-ceca-4928-90cc-5ebacc067735","Type":"ContainerStarted","Data":"4a2909800b8e042700a53859b279108573fdcb854943633616e65c2f0848cb95"} Apr 20 21:13:22.253051 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:22.253006 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fr6d2" podStartSLOduration=3.961128759 podStartE2EDuration="23.252994386s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.726674317 +0000 UTC m=+3.119859836" lastFinishedPulling="2026-04-20 21:13:21.018539926 +0000 UTC m=+22.411725463" observedRunningTime="2026-04-20 21:13:21.244497862 +0000 UTC m=+22.637683407" watchObservedRunningTime="2026-04-20 21:13:22.252994386 +0000 UTC m=+23.646179926" Apr 20 21:13:23.121523 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:23.121488 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:23.121695 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:23.121483 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:23.121695 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:23.121585 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:23.121695 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:23.121658 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:24.122088 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:24.122062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:24.122600 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:24.122158 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:24.241186 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:24.241157 2567 generic.go:358] "Generic (PLEG): container finished" podID="b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0" containerID="c06bc384dcbcfa66a1a0588257339e681795dd8c30c6aced04c62ceadfb7458f" exitCode=0 Apr 20 21:13:24.241281 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:24.241234 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerDied","Data":"c06bc384dcbcfa66a1a0588257339e681795dd8c30c6aced04c62ceadfb7458f"} Apr 20 21:13:24.246095 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:24.246073 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" event={"ID":"d4d4427b-ceca-4928-90cc-5ebacc067735","Type":"ContainerStarted","Data":"9950cea864edd0baf18ab3d4b3fb42aaad055a0719b3378188826c34e0ccdcf0"} Apr 20 21:13:24.278204 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:24.278160 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ftpsm" podStartSLOduration=3.548818962 podStartE2EDuration="25.278146727s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.723956429 +0000 UTC m=+3.117141948" lastFinishedPulling="2026-04-20 21:13:23.453284178 +0000 UTC m=+24.846469713" observedRunningTime="2026-04-20 21:13:24.277893665 +0000 UTC m=+25.671079205" watchObservedRunningTime="2026-04-20 21:13:24.278146727 +0000 UTC m=+25.671332267" Apr 20 21:13:25.122277 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:25.122251 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:25.122638 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:25.122251 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:25.122638 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:25.122369 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:25.122638 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:25.122433 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:25.250009 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:25.249955 2567 generic.go:358] "Generic (PLEG): container finished" podID="b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0" containerID="4f6b9da6ec82ac5bdaf42732df6c50cb7d86c3509a867088485d3acba420ef68" exitCode=0 Apr 20 21:13:25.250097 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:25.250012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerDied","Data":"4f6b9da6ec82ac5bdaf42732df6c50cb7d86c3509a867088485d3acba420ef68"} Apr 20 21:13:25.253167 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:25.253141 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"53a49679f33b6556abc6e9af3f232232a1295ca25876ebfaa5ec73259b8c3e71"} Apr 20 21:13:26.122083 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:26.122043 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:26.122254 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:26.122170 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:26.511205 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:26.510993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:26.511684 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:26.511153 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:26.511684 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:26.511342 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret podName:c487441d-7d18-4b60-b94d-d6e7a1fdc1a0 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:42.511318393 +0000 UTC m=+43.904503915 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret") pod "global-pull-secret-syncer-rjql2" (UID: "c487441d-7d18-4b60-b94d-d6e7a1fdc1a0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:13:27.121976 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:27.121943 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:27.122139 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:27.122071 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:27.122202 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:27.122161 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:27.122287 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:27.122265 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:27.260590 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:27.260552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" event={"ID":"b73ab204-677f-45ce-8d9d-26e042fd308c","Type":"ContainerStarted","Data":"5f29f6cb6be6f6ba20e74482a75f823a2d93e512b65a2b040056bb18614cbe5d"} Apr 20 21:13:27.260916 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:27.260883 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:27.277254 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:27.277227 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:27.313453 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:27.313398 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" podStartSLOduration=9.062934688 podStartE2EDuration="28.313384449s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.723404855 +0000 UTC m=+3.116590374" 
lastFinishedPulling="2026-04-20 21:13:20.973854617 +0000 UTC m=+22.367040135" observedRunningTime="2026-04-20 21:13:27.313186088 +0000 UTC m=+28.706371628" watchObservedRunningTime="2026-04-20 21:13:27.313384449 +0000 UTC m=+28.706569991" Apr 20 21:13:28.122253 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.122214 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:28.122781 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:28.122396 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:28.262776 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.262749 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:28.262924 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.262786 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:28.278899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.278877 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs" Apr 20 21:13:28.305897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.305689 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6p5ds"] Apr 20 21:13:28.305897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.305809 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:28.306035 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:28.305917 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:28.307961 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.307936 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rjql2"] Apr 20 21:13:28.308095 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.308027 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:28.308167 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:28.308120 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:28.309130 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.309109 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7pqmr"] Apr 20 21:13:28.309219 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:28.309211 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:28.309333 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:28.309310 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:30.122376 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:30.122342 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:30.122889 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:30.122384 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:30.122889 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:30.122354 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:30.122889 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:30.122489 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:30.122889 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:30.122591 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:30.122889 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:30.122677 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:31.270177 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:31.270003 2567 generic.go:358] "Generic (PLEG): container finished" podID="b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0" containerID="a7448183cea0501d109772eb789f6b1df41c6fc0fdcb8489c589c131e6da31c3" exitCode=0 Apr 20 21:13:31.270479 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:31.270086 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerDied","Data":"a7448183cea0501d109772eb789f6b1df41c6fc0fdcb8489c589c131e6da31c3"} Apr 20 21:13:32.122350 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:32.122321 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:32.122523 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:32.122321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:32.122523 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.122442 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:32.122635 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.122551 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:32.122635 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:32.122321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:32.122732 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.122651 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:32.274791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:32.274765 2567 generic.go:358] "Generic (PLEG): container finished" podID="b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0" containerID="188357307f81d8f89129c101e222c64b193a76ad10a3aeb011c69ce60fd41a53" exitCode=0 Apr 20 21:13:32.275084 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:32.274825 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerDied","Data":"188357307f81d8f89129c101e222c64b193a76ad10a3aeb011c69ce60fd41a53"} Apr 20 21:13:32.861753 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:32.861726 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:32.861895 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.861836 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:32.861895 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.861881 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:04.861866957 +0000 UTC m=+66.255052476 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:32.962710 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:32.962654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:32.962808 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.962793 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:32.962845 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.962809 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:32.962845 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.962818 2567 projected.go:194] Error preparing data for projected volume kube-api-access-kdck5 for pod openshift-network-diagnostics/network-check-target-7pqmr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:32.962906 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:32.962862 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5 podName:35b8e8ff-14c6-4807-bd77-b37eaea1544c nodeName:}" failed. 
No retries permitted until 2026-04-20 21:14:04.962849278 +0000 UTC m=+66.356034803 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdck5" (UniqueName: "kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5") pod "network-check-target-7pqmr" (UID: "35b8e8ff-14c6-4807-bd77-b37eaea1544c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:33.279852 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:33.279738 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-78h5h" event={"ID":"b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0","Type":"ContainerStarted","Data":"dece910d006fa8fbe79e8e44c52db87e3697216c855cc3272d51296dd1af3075"} Apr 20 21:13:33.303233 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:33.303194 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-78h5h" podStartSLOduration=5.082040958 podStartE2EDuration="34.3031833s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:13:01.728754808 +0000 UTC m=+3.121940342" lastFinishedPulling="2026-04-20 21:13:30.949897151 +0000 UTC m=+32.343082684" observedRunningTime="2026-04-20 21:13:33.301910986 +0000 UTC m=+34.695096599" watchObservedRunningTime="2026-04-20 21:13:33.3031833 +0000 UTC m=+34.696368842" Apr 20 21:13:34.122003 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.121975 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:13:34.122003 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.121975 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:13:34.122253 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:34.122072 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7pqmr" podUID="35b8e8ff-14c6-4807-bd77-b37eaea1544c" Apr 20 21:13:34.122253 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:34.122117 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6p5ds" podUID="ec3d4534-0f04-46f4-8eae-d37ac21ac0c6" Apr 20 21:13:34.122253 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.122153 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2" Apr 20 21:13:34.122253 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:34.122222 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rjql2" podUID="c487441d-7d18-4b60-b94d-d6e7a1fdc1a0" Apr 20 21:13:34.461091 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.461068 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-45.ec2.internal" event="NodeReady" Apr 20 21:13:34.461376 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.461178 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 21:13:34.507340 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.507292 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qq8hb"] Apr 20 21:13:34.549952 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.549931 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9n92s"] Apr 20 21:13:34.550108 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.550089 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.552566 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.552548 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtslr\"" Apr 20 21:13:34.552668 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.552608 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 21:13:34.552816 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.552803 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 21:13:34.563393 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.563378 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qq8hb"] Apr 20 21:13:34.563473 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.563398 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9n92s"] Apr 20 
21:13:34.563473 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.563471 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:13:34.568897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.568592 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 21:13:34.568897 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.568690 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 21:13:34.569073 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.568987 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 21:13:34.569265 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.569246 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kkzr9\"" Apr 20 21:13:34.674984 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.674936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:13:34.674984 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.674968 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cblbw\" (UniqueName: \"kubernetes.io/projected/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-kube-api-access-cblbw\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:13:34.675121 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.674986 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ba83c57-a195-426d-ab9b-969d9434f8d7-tmp-dir\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.675121 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.675007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba83c57-a195-426d-ab9b-969d9434f8d7-config-volume\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.675121 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.675052 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.675121 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.675069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmqbz\" (UniqueName: \"kubernetes.io/projected/9ba83c57-a195-426d-ab9b-969d9434f8d7-kube-api-access-mmqbz\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.775406 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:13:34.775492 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775409 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cblbw\" (UniqueName: \"kubernetes.io/projected/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-kube-api-access-cblbw\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:13:34.775492 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775445 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ba83c57-a195-426d-ab9b-969d9434f8d7-tmp-dir\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.775492 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba83c57-a195-426d-ab9b-969d9434f8d7-config-volume\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.775600 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:34.775527 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:13:34.775600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.775600 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:34.775594 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert podName:a0f24ff9-e85c-47ac-9ee4-3deb25a046a8 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:35.275576422 +0000 UTC m=+36.668761947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert") pod "ingress-canary-9n92s" (UID: "a0f24ff9-e85c-47ac-9ee4-3deb25a046a8") : secret "canary-serving-cert" not found Apr 20 21:13:34.775737 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775629 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmqbz\" (UniqueName: \"kubernetes.io/projected/9ba83c57-a195-426d-ab9b-969d9434f8d7-kube-api-access-mmqbz\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.775773 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:34.775749 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:13:34.775814 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775778 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ba83c57-a195-426d-ab9b-969d9434f8d7-tmp-dir\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.775814 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:34.775798 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls podName:9ba83c57-a195-426d-ab9b-969d9434f8d7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:35.27578735 +0000 UTC m=+36.668972874 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls") pod "dns-default-qq8hb" (UID: "9ba83c57-a195-426d-ab9b-969d9434f8d7") : secret "dns-default-metrics-tls" not found Apr 20 21:13:34.775988 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.775972 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba83c57-a195-426d-ab9b-969d9434f8d7-config-volume\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.785707 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.785685 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmqbz\" (UniqueName: \"kubernetes.io/projected/9ba83c57-a195-426d-ab9b-969d9434f8d7-kube-api-access-mmqbz\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:34.785779 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:34.785756 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cblbw\" (UniqueName: \"kubernetes.io/projected/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-kube-api-access-cblbw\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:13:35.279626 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:35.279599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:13:35.279752 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:35.279672 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s"
Apr 20 21:13:35.279752 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:35.279689 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:35.279752 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:35.279745 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls podName:9ba83c57-a195-426d-ab9b-969d9434f8d7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.279732339 +0000 UTC m=+37.672917857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls") pod "dns-default-qq8hb" (UID: "9ba83c57-a195-426d-ab9b-969d9434f8d7") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:35.279752 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:35.279749 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:35.279888 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:35.279795 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert podName:a0f24ff9-e85c-47ac-9ee4-3deb25a046a8 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:36.279783072 +0000 UTC m=+37.672968590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert") pod "ingress-canary-9n92s" (UID: "a0f24ff9-e85c-47ac-9ee4-3deb25a046a8") : secret "canary-serving-cert" not found
Apr 20 21:13:36.122386 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.122351 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr"
Apr 20 21:13:36.122971 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.122358 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds"
Apr 20 21:13:36.122971 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.122358 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2"
Apr 20 21:13:36.125339 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.125314 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 21:13:36.126251 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.126222 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs7p8\""
Apr 20 21:13:36.126382 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.126254 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 21:13:36.126382 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.126309 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 21:13:36.126382 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.126329 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 21:13:36.126382 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.126263 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lxd5q\""
Apr 20 21:13:36.285678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.285657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s"
Apr 20 21:13:36.285768 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:36.285694 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb"
Apr 20 21:13:36.285829 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:36.285797 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:36.285883 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:36.285853 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert podName:a0f24ff9-e85c-47ac-9ee4-3deb25a046a8 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:38.2858382 +0000 UTC m=+39.679023719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert") pod "ingress-canary-9n92s" (UID: "a0f24ff9-e85c-47ac-9ee4-3deb25a046a8") : secret "canary-serving-cert" not found
Apr 20 21:13:36.285943 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:36.285893 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:36.285996 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:36.285973 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls podName:9ba83c57-a195-426d-ab9b-969d9434f8d7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:38.285953499 +0000 UTC m=+39.679139067 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls") pod "dns-default-qq8hb" (UID: "9ba83c57-a195-426d-ab9b-969d9434f8d7") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:38.298702 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:38.298678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s"
Apr 20 21:13:38.299040 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:38.298718 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb"
Apr 20 21:13:38.299040 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:38.298817 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:38.299040 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:38.298866 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls podName:9ba83c57-a195-426d-ab9b-969d9434f8d7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:42.29885168 +0000 UTC m=+43.692037198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls") pod "dns-default-qq8hb" (UID: "9ba83c57-a195-426d-ab9b-969d9434f8d7") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:38.299040 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:38.298817 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:38.299040 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:38.298929 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert podName:a0f24ff9-e85c-47ac-9ee4-3deb25a046a8 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:42.298917892 +0000 UTC m=+43.692103430 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert") pod "ingress-canary-9n92s" (UID: "a0f24ff9-e85c-47ac-9ee4-3deb25a046a8") : secret "canary-serving-cert" not found
Apr 20 21:13:42.329791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:42.329758 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s"
Apr 20 21:13:42.330325 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:42.329800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb"
Apr 20 21:13:42.330325 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:42.329925 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:42.330325 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:42.329927 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:42.330325 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:42.329973 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls podName:9ba83c57-a195-426d-ab9b-969d9434f8d7 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:50.329959468 +0000 UTC m=+51.723144988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls") pod "dns-default-qq8hb" (UID: "9ba83c57-a195-426d-ab9b-969d9434f8d7") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:42.330325 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:42.329986 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert podName:a0f24ff9-e85c-47ac-9ee4-3deb25a046a8 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:50.329980265 +0000 UTC m=+51.723165784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert") pod "ingress-canary-9n92s" (UID: "a0f24ff9-e85c-47ac-9ee4-3deb25a046a8") : secret "canary-serving-cert" not found
Apr 20 21:13:42.530618 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:42.530586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2"
Apr 20 21:13:42.532770 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:42.532753 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c487441d-7d18-4b60-b94d-d6e7a1fdc1a0-original-pull-secret\") pod \"global-pull-secret-syncer-rjql2\" (UID: \"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0\") " pod="kube-system/global-pull-secret-syncer-rjql2"
Apr 20 21:13:42.743236 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:42.743182 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rjql2"
Apr 20 21:13:42.922657 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:42.922483 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rjql2"]
Apr 20 21:13:42.925956 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:42.925919 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc487441d_7d18_4b60_b94d_d6e7a1fdc1a0.slice/crio-5852d3756ca1c1ca8314ef688c0950c4393663bf6d1bcdaa903f5d290542f2d4 WatchSource:0}: Error finding container 5852d3756ca1c1ca8314ef688c0950c4393663bf6d1bcdaa903f5d290542f2d4: Status 404 returned error can't find the container with id 5852d3756ca1c1ca8314ef688c0950c4393663bf6d1bcdaa903f5d290542f2d4
Apr 20 21:13:42.993198 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:42.993172 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"]
Apr 20 21:13:43.011455 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.011364 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"]
Apr 20 21:13:43.011541 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.011521 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.013875 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.013855 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 21:13:43.014186 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.014165 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 21:13:43.015009 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.014989 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 20 21:13:43.015009 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.015002 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 21:13:43.022485 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.022469 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"]
Apr 20 21:13:43.022573 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.022489 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"]
Apr 20 21:13:43.022573 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.022557 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.024648 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.024631 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hpvqv\""
Apr 20 21:13:43.024720 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.024633 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 21:13:43.135467 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.135444 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfxk\" (UniqueName: \"kubernetes.io/projected/4ff15bc4-c5f1-4494-9602-463447062b85-kube-api-access-mmfxk\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.135546 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.135493 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/628b2a5b-c185-4f9b-8664-d42b2024f638-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-547f96c7bf-h8c4l\" (UID: \"628b2a5b-c185-4f9b-8664-d42b2024f638\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.135546 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.135515 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4ff15bc4-c5f1-4494-9602-463447062b85-klusterlet-config\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.135618 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.135586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5skcx\" (UniqueName: \"kubernetes.io/projected/628b2a5b-c185-4f9b-8664-d42b2024f638-kube-api-access-5skcx\") pod \"managed-serviceaccount-addon-agent-547f96c7bf-h8c4l\" (UID: \"628b2a5b-c185-4f9b-8664-d42b2024f638\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.135655 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.135618 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ff15bc4-c5f1-4494-9602-463447062b85-tmp\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.236659 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.236633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/628b2a5b-c185-4f9b-8664-d42b2024f638-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-547f96c7bf-h8c4l\" (UID: \"628b2a5b-c185-4f9b-8664-d42b2024f638\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.236749 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.236665 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4ff15bc4-c5f1-4494-9602-463447062b85-klusterlet-config\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.236749 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.236702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5skcx\" (UniqueName: \"kubernetes.io/projected/628b2a5b-c185-4f9b-8664-d42b2024f638-kube-api-access-5skcx\") pod \"managed-serviceaccount-addon-agent-547f96c7bf-h8c4l\" (UID: \"628b2a5b-c185-4f9b-8664-d42b2024f638\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.236749 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.236731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ff15bc4-c5f1-4494-9602-463447062b85-tmp\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.236869 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.236794 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfxk\" (UniqueName: \"kubernetes.io/projected/4ff15bc4-c5f1-4494-9602-463447062b85-kube-api-access-mmfxk\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.237171 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.237153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ff15bc4-c5f1-4494-9602-463447062b85-tmp\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.239830 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.239810 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4ff15bc4-c5f1-4494-9602-463447062b85-klusterlet-config\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.239830 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.239826 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/628b2a5b-c185-4f9b-8664-d42b2024f638-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-547f96c7bf-h8c4l\" (UID: \"628b2a5b-c185-4f9b-8664-d42b2024f638\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.245018 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.244995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfxk\" (UniqueName: \"kubernetes.io/projected/4ff15bc4-c5f1-4494-9602-463447062b85-kube-api-access-mmfxk\") pod \"klusterlet-addon-workmgr-666fbdcb5b-prlvp\" (UID: \"4ff15bc4-c5f1-4494-9602-463447062b85\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.245157 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.245142 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5skcx\" (UniqueName: \"kubernetes.io/projected/628b2a5b-c185-4f9b-8664-d42b2024f638-kube-api-access-5skcx\") pod \"managed-serviceaccount-addon-agent-547f96c7bf-h8c4l\" (UID: \"628b2a5b-c185-4f9b-8664-d42b2024f638\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.298276 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.298253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rjql2" event={"ID":"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0","Type":"ContainerStarted","Data":"5852d3756ca1c1ca8314ef688c0950c4393663bf6d1bcdaa903f5d290542f2d4"}
Apr 20 21:13:43.321475 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.321453 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:43.337946 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.337925 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"
Apr 20 21:13:43.440730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.440704 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"]
Apr 20 21:13:43.444364 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:43.444340 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff15bc4_c5f1_4494_9602_463447062b85.slice/crio-6a45d60dc62a1a460ce11fdd064470ae86d880bb0eb5d28998df46a583297f10 WatchSource:0}: Error finding container 6a45d60dc62a1a460ce11fdd064470ae86d880bb0eb5d28998df46a583297f10: Status 404 returned error can't find the container with id 6a45d60dc62a1a460ce11fdd064470ae86d880bb0eb5d28998df46a583297f10
Apr 20 21:13:43.450698 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:43.450677 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l"]
Apr 20 21:13:43.453249 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:13:43.453227 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod628b2a5b_c185_4f9b_8664_d42b2024f638.slice/crio-edd0dadd66535a88f372f3a3ab3caca7cc5b1e99dab9bb15ed6db0a0530c5d63 WatchSource:0}: Error finding container edd0dadd66535a88f372f3a3ab3caca7cc5b1e99dab9bb15ed6db0a0530c5d63: Status 404 returned error can't find the container with id edd0dadd66535a88f372f3a3ab3caca7cc5b1e99dab9bb15ed6db0a0530c5d63
Apr 20 21:13:44.301915 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:44.301879 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp" event={"ID":"4ff15bc4-c5f1-4494-9602-463447062b85","Type":"ContainerStarted","Data":"6a45d60dc62a1a460ce11fdd064470ae86d880bb0eb5d28998df46a583297f10"}
Apr 20 21:13:44.303583 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:44.303555 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l" event={"ID":"628b2a5b-c185-4f9b-8664-d42b2024f638","Type":"ContainerStarted","Data":"edd0dadd66535a88f372f3a3ab3caca7cc5b1e99dab9bb15ed6db0a0530c5d63"}
Apr 20 21:13:50.316993 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.316955 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rjql2" event={"ID":"c487441d-7d18-4b60-b94d-d6e7a1fdc1a0","Type":"ContainerStarted","Data":"e3417480ba056a934e02d3fb78ddf46bef9f2143e4e2c274d9b50ab25abd2b5c"}
Apr 20 21:13:50.318198 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.318173 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp" event={"ID":"4ff15bc4-c5f1-4494-9602-463447062b85","Type":"ContainerStarted","Data":"b151e930f8fa33849cd878456063c198610685808e49887273c48c1bc5c618a3"}
Apr 20 21:13:50.318401 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.318382 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:50.319658 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.319603 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l" event={"ID":"628b2a5b-c185-4f9b-8664-d42b2024f638","Type":"ContainerStarted","Data":"1c46393c6a4d5fbc2c2052684e3f857df2c552c7333b0c0ac47a981ed679058b"}
Apr 20 21:13:50.319960 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.319942 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp"
Apr 20 21:13:50.331544 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.331505 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rjql2" podStartSLOduration=33.262730169 podStartE2EDuration="40.331494263s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:42.927554445 +0000 UTC m=+44.320739964" lastFinishedPulling="2026-04-20 21:13:49.996318539 +0000 UTC m=+51.389504058" observedRunningTime="2026-04-20 21:13:50.331006864 +0000 UTC m=+51.724192404" watchObservedRunningTime="2026-04-20 21:13:50.331494263 +0000 UTC m=+51.724679804"
Apr 20 21:13:50.345330 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.345298 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-666fbdcb5b-prlvp" podStartSLOduration=1.779483381 podStartE2EDuration="8.345289236s" podCreationTimestamp="2026-04-20 21:13:42 +0000 UTC" firstStartedPulling="2026-04-20 21:13:43.445994985 +0000 UTC m=+44.839180507" lastFinishedPulling="2026-04-20 21:13:50.011800839 +0000 UTC m=+51.404986362" observedRunningTime="2026-04-20 21:13:50.344897882 +0000 UTC m=+51.738083424" watchObservedRunningTime="2026-04-20 21:13:50.345289236 +0000 UTC m=+51.738474777"
Apr 20 21:13:50.359019 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.358984 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-547f96c7bf-h8c4l" podStartSLOduration=1.817951444 podStartE2EDuration="8.358974134s" podCreationTimestamp="2026-04-20 21:13:42 +0000 UTC" firstStartedPulling="2026-04-20 21:13:43.454909162 +0000 UTC m=+44.848094680" lastFinishedPulling="2026-04-20 21:13:49.995931847 +0000 UTC m=+51.389117370" observedRunningTime="2026-04-20 21:13:50.358608942 +0000 UTC m=+51.751794483" watchObservedRunningTime="2026-04-20 21:13:50.358974134 +0000 UTC m=+51.752159665"
Apr 20 21:13:50.398843 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.398818 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s"
Apr 20 21:13:50.398924 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:13:50.398854 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb"
Apr 20 21:13:50.398962 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:50.398946 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:50.398962 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:50.398951 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:50.399026 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:50.398989 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls podName:9ba83c57-a195-426d-ab9b-969d9434f8d7 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:06.398977775 +0000 UTC m=+67.792163294 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls") pod "dns-default-qq8hb" (UID: "9ba83c57-a195-426d-ab9b-969d9434f8d7") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:50.399026 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:13:50.399005 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert podName:a0f24ff9-e85c-47ac-9ee4-3deb25a046a8 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:06.398993746 +0000 UTC m=+67.792179270 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert") pod "ingress-canary-9n92s" (UID: "a0f24ff9-e85c-47ac-9ee4-3deb25a046a8") : secret "canary-serving-cert" not found
Apr 20 21:14:00.277266 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:00.277234 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5qmcs"
Apr 20 21:14:04.895196 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:04.895154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds"
Apr 20 21:14:04.897751 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:04.897728 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 21:14:04.905802 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:04.905784 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 21:14:04.905864 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:04.905842 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs podName:ec3d4534-0f04-46f4-8eae-d37ac21ac0c6 nodeName:}" failed. No retries permitted until 2026-04-20 21:15:08.905827129 +0000 UTC m=+130.299012664 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs") pod "network-metrics-daemon-6p5ds" (UID: "ec3d4534-0f04-46f4-8eae-d37ac21ac0c6") : secret "metrics-daemon-secret" not found
Apr 20 21:14:04.996252 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:04.996231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr"
Apr 20 21:14:04.998872 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:04.998855 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 21:14:05.009105 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:05.009086 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 21:14:05.019590 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:05.019566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdck5\" (UniqueName: \"kubernetes.io/projected/35b8e8ff-14c6-4807-bd77-b37eaea1544c-kube-api-access-kdck5\") pod \"network-check-target-7pqmr\" (UID: \"35b8e8ff-14c6-4807-bd77-b37eaea1544c\") " pod="openshift-network-diagnostics/network-check-target-7pqmr"
Apr 20 21:14:05.235928 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:05.235873 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs7p8\""
Apr 20 21:14:05.243873 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:05.243852 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7pqmr"
Apr 20 21:14:05.352106 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:05.352074 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7pqmr"]
Apr 20 21:14:05.356672 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:05.356641 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b8e8ff_14c6_4807_bd77_b37eaea1544c.slice/crio-fece6cb68b96bf29b6863064f5377a4fdab94202143ad6b9f98c47db0c8e666f WatchSource:0}: Error finding container fece6cb68b96bf29b6863064f5377a4fdab94202143ad6b9f98c47db0c8e666f: Status 404 returned error can't find the container with id fece6cb68b96bf29b6863064f5377a4fdab94202143ad6b9f98c47db0c8e666f
Apr 20 21:14:06.350519 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:06.350474 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7pqmr" event={"ID":"35b8e8ff-14c6-4807-bd77-b37eaea1544c","Type":"ContainerStarted","Data":"fece6cb68b96bf29b6863064f5377a4fdab94202143ad6b9f98c47db0c8e666f"}
Apr 20 21:14:06.405212 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:06.405177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s"
Apr 20 21:14:06.405364 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:06.405239 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb"
Apr 20 21:14:06.405364 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:06.405353 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:14:06.405503 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:06.405367 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:14:06.405503 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:06.405436 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert podName:a0f24ff9-e85c-47ac-9ee4-3deb25a046a8 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:38.405408529 +0000 UTC m=+99.798594047 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert") pod "ingress-canary-9n92s" (UID: "a0f24ff9-e85c-47ac-9ee4-3deb25a046a8") : secret "canary-serving-cert" not found
Apr 20 21:14:06.405503 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:06.405459 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls podName:9ba83c57-a195-426d-ab9b-969d9434f8d7 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:38.405448937 +0000 UTC m=+99.798634474 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls") pod "dns-default-qq8hb" (UID: "9ba83c57-a195-426d-ab9b-969d9434f8d7") : secret "dns-default-metrics-tls" not found
Apr 20 21:14:08.074556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.074528 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7"]
Apr 20 21:14:08.077248 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.077233 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7"
Apr 20 21:14:08.079811 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.079792 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 21:14:08.080253 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.080185 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-44n8l\""
Apr 20 21:14:08.080253 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.080224 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 21:14:08.080410 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.080326 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 21:14:08.083489 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.082613 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 21:14:08.087380 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.087362 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7"]
Apr 20
21:14:08.116033 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.116013 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.116131 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.116042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7c8850e0-79f1-40dd-be01-35e964ad62be-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.116131 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.116060 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qld5\" (UniqueName: \"kubernetes.io/projected/7c8850e0-79f1-40dd-be01-35e964ad62be-kube-api-access-4qld5\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.216513 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.216454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.216513 ip-10-0-132-45 kubenswrapper[2567]: I0420 
21:14:08.216495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7c8850e0-79f1-40dd-be01-35e964ad62be-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.216513 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.216512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qld5\" (UniqueName: \"kubernetes.io/projected/7c8850e0-79f1-40dd-be01-35e964ad62be-kube-api-access-4qld5\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.216658 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:08.216588 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:08.216658 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:08.216655 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls podName:7c8850e0-79f1-40dd-be01-35e964ad62be nodeName:}" failed. No retries permitted until 2026-04-20 21:14:08.716640604 +0000 UTC m=+70.109826123 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47gc7" (UID: "7c8850e0-79f1-40dd-be01-35e964ad62be") : secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:08.217152 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.217134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7c8850e0-79f1-40dd-be01-35e964ad62be-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.225365 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.225336 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qld5\" (UniqueName: \"kubernetes.io/projected/7c8850e0-79f1-40dd-be01-35e964ad62be-kube-api-access-4qld5\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.356148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.356111 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7pqmr" event={"ID":"35b8e8ff-14c6-4807-bd77-b37eaea1544c","Type":"ContainerStarted","Data":"972fee3c6d346ddc70dfd55437cd302b146c9689fce5d16e9f0d6325624a0207"} Apr 20 21:14:08.356309 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.356290 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:14:08.371084 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.371048 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-7pqmr" podStartSLOduration=66.817291532 podStartE2EDuration="1m9.371037837s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:14:05.358501577 +0000 UTC m=+66.751687096" lastFinishedPulling="2026-04-20 21:14:07.912247879 +0000 UTC m=+69.305433401" observedRunningTime="2026-04-20 21:14:08.370835633 +0000 UTC m=+69.764021188" watchObservedRunningTime="2026-04-20 21:14:08.371037837 +0000 UTC m=+69.764223377" Apr 20 21:14:08.720486 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.720444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:08.720613 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:08.720595 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:08.720679 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:08.720667 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls podName:7c8850e0-79f1-40dd-be01-35e964ad62be nodeName:}" failed. No retries permitted until 2026-04-20 21:14:09.720649801 +0000 UTC m=+71.113835332 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47gc7" (UID: "7c8850e0-79f1-40dd-be01-35e964ad62be") : secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:08.787487 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.787464 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw"] Apr 20 21:14:08.790384 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.790369 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.794880 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.794854 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 21:14:08.795016 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.794929 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 21:14:08.795016 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.794934 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 21:14:08.795138 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.795045 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-knzgs\"" Apr 20 21:14:08.795138 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.795054 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:14:08.799514 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.799491 2567 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw"] Apr 20 21:14:08.821312 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.821289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4535173-97a4-4c0b-aba0-a435bd525510-config\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.821402 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.821334 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frnf6\" (UniqueName: \"kubernetes.io/projected/b4535173-97a4-4c0b-aba0-a435bd525510-kube-api-access-frnf6\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.821402 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.821381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4535173-97a4-4c0b-aba0-a435bd525510-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.922327 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.922306 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4535173-97a4-4c0b-aba0-a435bd525510-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.922398 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.922347 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4535173-97a4-4c0b-aba0-a435bd525510-config\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.922489 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.922472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frnf6\" (UniqueName: \"kubernetes.io/projected/b4535173-97a4-4c0b-aba0-a435bd525510-kube-api-access-frnf6\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.922880 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.922855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4535173-97a4-4c0b-aba0-a435bd525510-config\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.924396 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.924379 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4535173-97a4-4c0b-aba0-a435bd525510-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: \"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:08.930559 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:08.930540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frnf6\" (UniqueName: \"kubernetes.io/projected/b4535173-97a4-4c0b-aba0-a435bd525510-kube-api-access-frnf6\") pod \"service-ca-operator-d6fc45fc5-sfxjw\" (UID: 
\"b4535173-97a4-4c0b-aba0-a435bd525510\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:09.099622 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:09.099604 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" Apr 20 21:14:09.211329 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:09.211295 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw"] Apr 20 21:14:09.215810 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:09.215782 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4535173_97a4_4c0b_aba0_a435bd525510.slice/crio-4543d4f3c3528cb10ef9d51c01c7de711328b3fa8e05dadfd3378b19f5be386f WatchSource:0}: Error finding container 4543d4f3c3528cb10ef9d51c01c7de711328b3fa8e05dadfd3378b19f5be386f: Status 404 returned error can't find the container with id 4543d4f3c3528cb10ef9d51c01c7de711328b3fa8e05dadfd3378b19f5be386f Apr 20 21:14:09.359270 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:09.359218 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" event={"ID":"b4535173-97a4-4c0b-aba0-a435bd525510","Type":"ContainerStarted","Data":"4543d4f3c3528cb10ef9d51c01c7de711328b3fa8e05dadfd3378b19f5be386f"} Apr 20 21:14:09.727844 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:09.727789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:09.727940 ip-10-0-132-45 
kubenswrapper[2567]: E0420 21:14:09.727922 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:09.727995 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:09.727985 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls podName:7c8850e0-79f1-40dd-be01-35e964ad62be nodeName:}" failed. No retries permitted until 2026-04-20 21:14:11.727970964 +0000 UTC m=+73.121156483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47gc7" (UID: "7c8850e0-79f1-40dd-be01-35e964ad62be") : secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:11.742739 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:11.742661 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:11.743129 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:11.742798 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:11.743129 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:11.742856 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls podName:7c8850e0-79f1-40dd-be01-35e964ad62be nodeName:}" failed. 
No retries permitted until 2026-04-20 21:14:15.742841798 +0000 UTC m=+77.136027317 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47gc7" (UID: "7c8850e0-79f1-40dd-be01-35e964ad62be") : secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:12.366593 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:12.366558 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" event={"ID":"b4535173-97a4-4c0b-aba0-a435bd525510","Type":"ContainerStarted","Data":"efb1190a603d65c3dca9a7e90692f8b180bd2759fe08e171a8af311c51e1d5be"} Apr 20 21:14:12.381855 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:12.381815 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" podStartSLOduration=2.177654756 podStartE2EDuration="4.381800366s" podCreationTimestamp="2026-04-20 21:14:08 +0000 UTC" firstStartedPulling="2026-04-20 21:14:09.21757717 +0000 UTC m=+70.610762693" lastFinishedPulling="2026-04-20 21:14:11.421722771 +0000 UTC m=+72.814908303" observedRunningTime="2026-04-20 21:14:12.381667003 +0000 UTC m=+73.774852544" watchObservedRunningTime="2026-04-20 21:14:12.381800366 +0000 UTC m=+73.774985909" Apr 20 21:14:15.770378 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:15.770336 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:15.770813 ip-10-0-132-45 
kubenswrapper[2567]: E0420 21:14:15.770479 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:15.770813 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:15.770534 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls podName:7c8850e0-79f1-40dd-be01-35e964ad62be nodeName:}" failed. No retries permitted until 2026-04-20 21:14:23.770519163 +0000 UTC m=+85.163704682 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47gc7" (UID: "7c8850e0-79f1-40dd-be01-35e964ad62be") : secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:15.895442 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:15.895399 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c5gzg_e05ebd33-1ace-4f1d-8379-0925d5e79b13/dns-node-resolver/0.log" Apr 20 21:14:16.695783 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:16.695752 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jk6wx_2a4eea4f-8eef-446c-a518-bcb5b140ee35/node-ca/0.log" Apr 20 21:14:23.827334 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:23.827301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:23.827716 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:23.827464 2567 secret.go:189] Couldn't 
get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:23.827716 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:23.827546 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls podName:7c8850e0-79f1-40dd-be01-35e964ad62be nodeName:}" failed. No retries permitted until 2026-04-20 21:14:39.827532399 +0000 UTC m=+101.220717921 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-47gc7" (UID: "7c8850e0-79f1-40dd-be01-35e964ad62be") : secret "cluster-monitoring-operator-tls" not found Apr 20 21:14:34.088472 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.088418 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-trkwb"] Apr 20 21:14:34.095559 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.095539 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.098181 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.098154 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 21:14:34.099074 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.099053 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 21:14:34.099170 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.099107 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zjtfj\"" Apr 20 21:14:34.101073 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.101047 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-trkwb"] Apr 20 21:14:34.142875 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.142854 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9gggf"] Apr 20 21:14:34.145938 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.145923 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.148561 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.148537 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 21:14:34.148984 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.148968 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 21:14:34.149139 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.149121 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 21:14:34.149139 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.149137 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 21:14:34.149245 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.149165 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4kwqq\"" Apr 20 21:14:34.162266 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.162242 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9gggf"] Apr 20 21:14:34.202033 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.202013 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.202137 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.202041 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-ktbb9\" (UniqueName: \"kubernetes.io/projected/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-kube-api-access-ktbb9\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.202137 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.202067 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/777c3126-65de-4706-a59a-112f7ec7916a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-trkwb\" (UID: \"777c3126-65de-4706-a59a-112f7ec7916a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.202224 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.202146 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-crio-socket\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.202224 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.202172 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/777c3126-65de-4706-a59a-112f7ec7916a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-trkwb\" (UID: \"777c3126-65de-4706-a59a-112f7ec7916a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.202287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.202254 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-data-volume\") pod 
\"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.202317 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.202291 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.220859 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.220818 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56c5557f99-ktv6g"] Apr 20 21:14:34.223852 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.223837 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.229137 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.229116 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 21:14:34.229232 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.229210 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4cm2d\"" Apr 20 21:14:34.230199 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.230175 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 21:14:34.230373 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.230356 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 21:14:34.243524 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.243506 2567 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 21:14:34.268134 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.268116 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56c5557f99-ktv6g"] Apr 20 21:14:34.288993 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.288974 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56c5557f99-ktv6g"] Apr 20 21:14:34.289122 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:34.289105 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-sg2fb registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" podUID="4b753347-4d97-4cf2-bf6d-2601352892a0" Apr 20 21:14:34.303198 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303175 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-certificates\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303276 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303208 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/777c3126-65de-4706-a59a-112f7ec7916a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-trkwb\" (UID: \"777c3126-65de-4706-a59a-112f7ec7916a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.303276 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303224 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-crio-socket\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.303276 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-data-volume\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.303412 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.303412 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303292 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-installation-pull-secrets\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303412 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-crio-socket\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " 
pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.303412 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.303412 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktbb9\" (UniqueName: \"kubernetes.io/projected/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-kube-api-access-ktbb9\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.303412 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303364 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/777c3126-65de-4706-a59a-112f7ec7916a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-trkwb\" (UID: \"777c3126-65de-4706-a59a-112f7ec7916a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.303412 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303382 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b753347-4d97-4cf2-bf6d-2601352892a0-ca-trust-extracted\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303448 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-tls\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303475 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-bound-sa-token\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg2fb\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-kube-api-access-sg2fb\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303551 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-trusted-ca\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303622 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-image-registry-private-configuration\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.303754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303698 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-data-volume\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.304019 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303924 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/777c3126-65de-4706-a59a-112f7ec7916a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-trkwb\" (UID: \"777c3126-65de-4706-a59a-112f7ec7916a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.304019 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.303964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.306015 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.305991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.306093 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.306046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/777c3126-65de-4706-a59a-112f7ec7916a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-trkwb\" (UID: \"777c3126-65de-4706-a59a-112f7ec7916a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.318154 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.318131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktbb9\" (UniqueName: \"kubernetes.io/projected/2d4d9854-af8f-42c6-928a-3bbb105e3f5a-kube-api-access-ktbb9\") pod \"insights-runtime-extractor-9gggf\" (UID: \"2d4d9854-af8f-42c6-928a-3bbb105e3f5a\") " pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.404213 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.404162 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-tls\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.404213 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.404191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-bound-sa-token\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.404323 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.404208 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg2fb\" (UniqueName: 
\"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-kube-api-access-sg2fb\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.404323 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.404236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-trusted-ca\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.404767 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.404744 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" Apr 20 21:14:34.407287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.404631 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-image-registry-private-configuration\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.407287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.405448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-certificates\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.407287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.405645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-installation-pull-secrets\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.407287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.405703 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-trusted-ca\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.407287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.405810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b753347-4d97-4cf2-bf6d-2601352892a0-ca-trust-extracted\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.407287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.406808 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-certificates\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.407287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.406868 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b753347-4d97-4cf2-bf6d-2601352892a0-ca-trust-extracted\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.408540 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.408520 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-tls\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.409150 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.409121 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-installation-pull-secrets\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.409883 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.409862 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-image-registry-private-configuration\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.411118 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.411094 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.413119 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.413089 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-bound-sa-token\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.413644 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.413623 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg2fb\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-kube-api-access-sg2fb\") pod \"image-registry-56c5557f99-ktv6g\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.423765 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.423749 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:34.454148 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.454095 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9gggf" Apr 20 21:14:34.507662 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.507257 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg2fb\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-kube-api-access-sg2fb\") pod \"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.507662 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.507320 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-trusted-ca\") pod \"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.507662 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.507357 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-bound-sa-token\") pod \"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.507662 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.507391 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-certificates\") pod \"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.508526 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.508592 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.508608 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-installation-pull-secrets\") pod \"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.508654 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-image-registry-private-configuration\") pod \"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.508705 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b753347-4d97-4cf2-bf6d-2601352892a0-ca-trust-extracted\") pod \"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.508740 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-tls\") pod 
\"4b753347-4d97-4cf2-bf6d-2601352892a0\" (UID: \"4b753347-4d97-4cf2-bf6d-2601352892a0\") " Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.508985 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-certificates\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:34.510337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.509493 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b753347-4d97-4cf2-bf6d-2601352892a0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:14:34.511018 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.510973 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-kube-api-access-sg2fb" (OuterVolumeSpecName: "kube-api-access-sg2fb") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "kube-api-access-sg2fb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:14:34.512021 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.511915 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:14:34.512021 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.511979 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:14:34.512545 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.512505 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:14:34.513034 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.512998 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4b753347-4d97-4cf2-bf6d-2601352892a0" (UID: "4b753347-4d97-4cf2-bf6d-2601352892a0"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:14:34.530649 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.530628 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-trkwb"] Apr 20 21:14:34.533528 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:34.533483 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod777c3126_65de_4706_a59a_112f7ec7916a.slice/crio-d7bd08348604f8304fa9630c37909baea6a04d857f62135ebd918bc4037b8b49 WatchSource:0}: Error finding container d7bd08348604f8304fa9630c37909baea6a04d857f62135ebd918bc4037b8b49: Status 404 returned error can't find the container with id d7bd08348604f8304fa9630c37909baea6a04d857f62135ebd918bc4037b8b49 Apr 20 21:14:34.575360 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.575337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9gggf"] Apr 20 21:14:34.580072 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:34.580053 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d4d9854_af8f_42c6_928a_3bbb105e3f5a.slice/crio-0f4d17e634a02aa6a8dfacc7c415b3590226158398a1dc2fabcfe59c1234ce4b WatchSource:0}: Error finding container 0f4d17e634a02aa6a8dfacc7c415b3590226158398a1dc2fabcfe59c1234ce4b: Status 404 returned error can't find the container with id 0f4d17e634a02aa6a8dfacc7c415b3590226158398a1dc2fabcfe59c1234ce4b Apr 20 21:14:34.609549 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.609525 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-installation-pull-secrets\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:34.609549 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.609547 2567 
reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b753347-4d97-4cf2-bf6d-2601352892a0-image-registry-private-configuration\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:34.609670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.609558 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b753347-4d97-4cf2-bf6d-2601352892a0-ca-trust-extracted\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:34.609670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.609567 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-registry-tls\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:34.609670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.609576 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sg2fb\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-kube-api-access-sg2fb\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:34.609670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.609585 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b753347-4d97-4cf2-bf6d-2601352892a0-trusted-ca\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:34.609670 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:34.609593 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b753347-4d97-4cf2-bf6d-2601352892a0-bound-sa-token\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:14:35.415649 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:35.415536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-9gggf" event={"ID":"2d4d9854-af8f-42c6-928a-3bbb105e3f5a","Type":"ContainerStarted","Data":"6e26709a9655f0d27fb08939df195c2b4697a0d4c01a452b4b1f51e7d24b1dfa"} Apr 20 21:14:35.415649 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:35.415582 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gggf" event={"ID":"2d4d9854-af8f-42c6-928a-3bbb105e3f5a","Type":"ContainerStarted","Data":"0f4d17e634a02aa6a8dfacc7c415b3590226158398a1dc2fabcfe59c1234ce4b"} Apr 20 21:14:35.417007 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:35.416984 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56c5557f99-ktv6g" Apr 20 21:14:35.417149 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:35.417102 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" event={"ID":"777c3126-65de-4706-a59a-112f7ec7916a","Type":"ContainerStarted","Data":"d7bd08348604f8304fa9630c37909baea6a04d857f62135ebd918bc4037b8b49"} Apr 20 21:14:35.452065 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:35.452031 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56c5557f99-ktv6g"] Apr 20 21:14:35.457233 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:35.457206 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56c5557f99-ktv6g"] Apr 20 21:14:36.421209 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:36.421171 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gggf" event={"ID":"2d4d9854-af8f-42c6-928a-3bbb105e3f5a","Type":"ContainerStarted","Data":"edafd19e945a1a13f6bba810fd7b859b8ecc22afd9478d7a6279deb489b92df4"} Apr 20 21:14:36.422653 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:36.422615 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" event={"ID":"777c3126-65de-4706-a59a-112f7ec7916a","Type":"ContainerStarted","Data":"a46d81e1bc716a1d72af1e81ae079f8eb26abbd44e84903ca6fbdcd09ecad25e"} Apr 20 21:14:36.439539 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:36.439489 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-trkwb" podStartSLOduration=1.333154232 podStartE2EDuration="2.439472774s" podCreationTimestamp="2026-04-20 21:14:34 +0000 UTC" firstStartedPulling="2026-04-20 21:14:34.535922849 +0000 UTC m=+95.929108369" lastFinishedPulling="2026-04-20 21:14:35.642241392 +0000 UTC m=+97.035426911" observedRunningTime="2026-04-20 21:14:36.438266922 +0000 UTC m=+97.831452476" watchObservedRunningTime="2026-04-20 21:14:36.439472774 +0000 UTC m=+97.832658316" Apr 20 21:14:37.125234 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:37.125200 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b753347-4d97-4cf2-bf6d-2601352892a0" path="/var/lib/kubelet/pods/4b753347-4d97-4cf2-bf6d-2601352892a0/volumes" Apr 20 21:14:37.426758 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:37.426724 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9gggf" event={"ID":"2d4d9854-af8f-42c6-928a-3bbb105e3f5a","Type":"ContainerStarted","Data":"7fe4d95da6c311ec912381e5834bfb65064386f06f94923c8440f48713c54925"} Apr 20 21:14:37.444936 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:37.444894 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9gggf" podStartSLOduration=1.151834367 podStartE2EDuration="3.444881384s" podCreationTimestamp="2026-04-20 21:14:34 +0000 UTC" firstStartedPulling="2026-04-20 21:14:34.634636017 +0000 UTC m=+96.027821541" lastFinishedPulling="2026-04-20 21:14:36.927683039 +0000 UTC 
m=+98.320868558" observedRunningTime="2026-04-20 21:14:37.443717376 +0000 UTC m=+98.836902930" watchObservedRunningTime="2026-04-20 21:14:37.444881384 +0000 UTC m=+98.838066924" Apr 20 21:14:38.437272 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.437240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:14:38.437669 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.437307 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:14:38.439586 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.439555 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ba83c57-a195-426d-ab9b-969d9434f8d7-metrics-tls\") pod \"dns-default-qq8hb\" (UID: \"9ba83c57-a195-426d-ab9b-969d9434f8d7\") " pod="openshift-dns/dns-default-qq8hb" Apr 20 21:14:38.439758 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.439736 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0f24ff9-e85c-47ac-9ee4-3deb25a046a8-cert\") pod \"ingress-canary-9n92s\" (UID: \"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8\") " pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:14:38.461128 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.461110 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtslr\"" Apr 20 21:14:38.469597 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.469555 2567 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qq8hb" Apr 20 21:14:38.476362 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.476339 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kkzr9\"" Apr 20 21:14:38.484456 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.484403 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9n92s" Apr 20 21:14:38.593743 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.593711 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qq8hb"] Apr 20 21:14:38.596327 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:38.596302 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ba83c57_a195_426d_ab9b_969d9434f8d7.slice/crio-2943d78fd57675303f3412e1022462f05dcccf32389cd447b9e87fb86f7bad61 WatchSource:0}: Error finding container 2943d78fd57675303f3412e1022462f05dcccf32389cd447b9e87fb86f7bad61: Status 404 returned error can't find the container with id 2943d78fd57675303f3412e1022462f05dcccf32389cd447b9e87fb86f7bad61 Apr 20 21:14:38.610477 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:38.610453 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9n92s"] Apr 20 21:14:38.612975 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:38.612955 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f24ff9_e85c_47ac_9ee4_3deb25a046a8.slice/crio-024c24566e1c195b93a1c1a5b10385f5596279c2fa31d88e7dd43ad6a3a6d5a4 WatchSource:0}: Error finding container 024c24566e1c195b93a1c1a5b10385f5596279c2fa31d88e7dd43ad6a3a6d5a4: Status 404 returned error can't find the container with id 024c24566e1c195b93a1c1a5b10385f5596279c2fa31d88e7dd43ad6a3a6d5a4 Apr 20 
21:14:39.361644 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:39.361614 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7pqmr" Apr 20 21:14:39.434630 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:39.434444 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qq8hb" event={"ID":"9ba83c57-a195-426d-ab9b-969d9434f8d7","Type":"ContainerStarted","Data":"2943d78fd57675303f3412e1022462f05dcccf32389cd447b9e87fb86f7bad61"} Apr 20 21:14:39.435849 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:39.435818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9n92s" event={"ID":"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8","Type":"ContainerStarted","Data":"024c24566e1c195b93a1c1a5b10385f5596279c2fa31d88e7dd43ad6a3a6d5a4"} Apr 20 21:14:39.849439 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:39.849352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:39.852515 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:39.852465 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c8850e0-79f1-40dd-be01-35e964ad62be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-47gc7\" (UID: \"7c8850e0-79f1-40dd-be01-35e964ad62be\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:39.887260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:39.887232 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" Apr 20 21:14:40.833720 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:40.833690 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7"] Apr 20 21:14:40.839204 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:40.839181 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c8850e0_79f1_40dd_be01_35e964ad62be.slice/crio-0037eef9f05797e394650cabac99eb0aab1ca1bfe680e5f87c053255516fd172 WatchSource:0}: Error finding container 0037eef9f05797e394650cabac99eb0aab1ca1bfe680e5f87c053255516fd172: Status 404 returned error can't find the container with id 0037eef9f05797e394650cabac99eb0aab1ca1bfe680e5f87c053255516fd172 Apr 20 21:14:41.445011 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:41.444972 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9n92s" event={"ID":"a0f24ff9-e85c-47ac-9ee4-3deb25a046a8","Type":"ContainerStarted","Data":"c0fe6392ef586882be5ecb5e79836a0c2d720909a332a39ebec0d0767c50aa68"} Apr 20 21:14:41.446312 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:41.446277 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" event={"ID":"7c8850e0-79f1-40dd-be01-35e964ad62be","Type":"ContainerStarted","Data":"0037eef9f05797e394650cabac99eb0aab1ca1bfe680e5f87c053255516fd172"} Apr 20 21:14:41.448098 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:41.448073 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qq8hb" event={"ID":"9ba83c57-a195-426d-ab9b-969d9434f8d7","Type":"ContainerStarted","Data":"5f6ddccb99dd29b240297de76006f677a7a036df6877e1cca03ea96ab064198e"} Apr 20 21:14:41.448222 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:41.448102 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qq8hb" event={"ID":"9ba83c57-a195-426d-ab9b-969d9434f8d7","Type":"ContainerStarted","Data":"d5678453eb0ec0575504bdccb24126a298fc0edcacd226c3abecdfbe124bd30f"} Apr 20 21:14:41.448283 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:41.448265 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qq8hb" Apr 20 21:14:41.461550 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:41.461506 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9n92s" podStartSLOduration=65.369509934 podStartE2EDuration="1m7.46149166s" podCreationTimestamp="2026-04-20 21:13:34 +0000 UTC" firstStartedPulling="2026-04-20 21:14:38.614687817 +0000 UTC m=+100.007873337" lastFinishedPulling="2026-04-20 21:14:40.70666953 +0000 UTC m=+102.099855063" observedRunningTime="2026-04-20 21:14:41.460521958 +0000 UTC m=+102.853707500" watchObservedRunningTime="2026-04-20 21:14:41.46149166 +0000 UTC m=+102.854677203" Apr 20 21:14:41.478224 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:41.478114 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qq8hb" podStartSLOduration=65.374275136 podStartE2EDuration="1m7.478099561s" podCreationTimestamp="2026-04-20 21:13:34 +0000 UTC" firstStartedPulling="2026-04-20 21:14:38.598131639 +0000 UTC m=+99.991317157" lastFinishedPulling="2026-04-20 21:14:40.701956048 +0000 UTC m=+102.095141582" observedRunningTime="2026-04-20 21:14:41.477656465 +0000 UTC m=+102.870842007" watchObservedRunningTime="2026-04-20 21:14:41.478099561 +0000 UTC m=+102.871285103" Apr 20 21:14:43.164930 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.164897 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj"] Apr 20 21:14:43.167702 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.167679 
2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:43.169942 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.169918 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 21:14:43.170098 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.170083 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-864t4\"" Apr 20 21:14:43.175379 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.175358 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj"] Apr 20 21:14:43.272799 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.272775 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8f276464-4ba8-4641-8b8f-67e5405b3f5e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8nwtj\" (UID: \"8f276464-4ba8-4641-8b8f-67e5405b3f5e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:43.373451 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.373410 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8f276464-4ba8-4641-8b8f-67e5405b3f5e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8nwtj\" (UID: \"8f276464-4ba8-4641-8b8f-67e5405b3f5e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:43.373540 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:43.373530 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: 
secret "prometheus-operator-admission-webhook-tls" not found Apr 20 21:14:43.373588 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:43.373579 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f276464-4ba8-4641-8b8f-67e5405b3f5e-tls-certificates podName:8f276464-4ba8-4641-8b8f-67e5405b3f5e nodeName:}" failed. No retries permitted until 2026-04-20 21:14:43.87356545 +0000 UTC m=+105.266750969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/8f276464-4ba8-4641-8b8f-67e5405b3f5e-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-8nwtj" (UID: "8f276464-4ba8-4641-8b8f-67e5405b3f5e") : secret "prometheus-operator-admission-webhook-tls" not found Apr 20 21:14:43.455938 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.455874 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" event={"ID":"7c8850e0-79f1-40dd-be01-35e964ad62be","Type":"ContainerStarted","Data":"6599ec51845672196f28c59ccfda9d28658f491215c946c5177db9d24e516ac5"} Apr 20 21:14:43.470836 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.470784 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-47gc7" podStartSLOduration=33.638783351 podStartE2EDuration="35.470768836s" podCreationTimestamp="2026-04-20 21:14:08 +0000 UTC" firstStartedPulling="2026-04-20 21:14:40.841746491 +0000 UTC m=+102.234932011" lastFinishedPulling="2026-04-20 21:14:42.673731974 +0000 UTC m=+104.066917496" observedRunningTime="2026-04-20 21:14:43.470672853 +0000 UTC m=+104.863858395" watchObservedRunningTime="2026-04-20 21:14:43.470768836 +0000 UTC m=+104.863954382" Apr 20 21:14:43.663769 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.663749 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-579df576b8-7lpnc"] Apr 20 
21:14:43.666653 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.666638 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.669487 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.669467 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 21:14:43.669588 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.669575 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 21:14:43.669697 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.669682 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 21:14:43.674669 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.674651 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 21:14:43.674745 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.674659 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-69m2r\"" Apr 20 21:14:43.676462 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.676444 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 21:14:43.676511 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.676485 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 21:14:43.676742 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.676728 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 21:14:43.681745 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.681728 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-579df576b8-7lpnc"] Apr 20 21:14:43.777153 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.777131 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-oauth-serving-cert\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.777237 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.777170 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-serving-cert\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.777301 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.777235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-oauth-config\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.777301 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.777271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rszxg\" (UniqueName: \"kubernetes.io/projected/a253c448-b3d5-4789-8811-020aa486a4f9-kube-api-access-rszxg\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.777301 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.777296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-service-ca\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.777408 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.777323 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-console-config\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.877982 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.877960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8f276464-4ba8-4641-8b8f-67e5405b3f5e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8nwtj\" (UID: \"8f276464-4ba8-4641-8b8f-67e5405b3f5e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:43.878077 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.877989 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-oauth-config\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.878077 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.878008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rszxg\" (UniqueName: \"kubernetes.io/projected/a253c448-b3d5-4789-8811-020aa486a4f9-kube-api-access-rszxg\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.878077 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:14:43.878026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-service-ca\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.878077 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.878048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-console-config\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.878260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.878100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-oauth-serving-cert\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.878260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.878132 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-serving-cert\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.882902 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.881288 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-serving-cert\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " 
pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.882902 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.881350 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8f276464-4ba8-4641-8b8f-67e5405b3f5e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8nwtj\" (UID: \"8f276464-4ba8-4641-8b8f-67e5405b3f5e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:43.883853 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.883830 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-console-config\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.883929 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.883909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-oauth-serving-cert\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.883979 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.883913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-service-ca\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.883979 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.883939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-oauth-config\") pod 
\"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.885673 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.885655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rszxg\" (UniqueName: \"kubernetes.io/projected/a253c448-b3d5-4789-8811-020aa486a4f9-kube-api-access-rszxg\") pod \"console-579df576b8-7lpnc\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:43.975074 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:43.975044 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:44.076719 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:44.076692 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:44.120005 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:44.119976 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579df576b8-7lpnc"] Apr 20 21:14:44.124699 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:44.124641 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda253c448_b3d5_4789_8811_020aa486a4f9.slice/crio-08e073180647cfe2b376975c6315a1f07791408610197e1d845ab6faf11a8145 WatchSource:0}: Error finding container 08e073180647cfe2b376975c6315a1f07791408610197e1d845ab6faf11a8145: Status 404 returned error can't find the container with id 08e073180647cfe2b376975c6315a1f07791408610197e1d845ab6faf11a8145 Apr 20 21:14:44.193259 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:44.193230 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj"] Apr 20 21:14:44.195703 ip-10-0-132-45 
kubenswrapper[2567]: W0420 21:14:44.195681 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f276464_4ba8_4641_8b8f_67e5405b3f5e.slice/crio-656633eab75bf305587018dfd09eac90be165b4d631774544041500ae00ab5db WatchSource:0}: Error finding container 656633eab75bf305587018dfd09eac90be165b4d631774544041500ae00ab5db: Status 404 returned error can't find the container with id 656633eab75bf305587018dfd09eac90be165b4d631774544041500ae00ab5db Apr 20 21:14:44.459974 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:44.459905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" event={"ID":"8f276464-4ba8-4641-8b8f-67e5405b3f5e","Type":"ContainerStarted","Data":"656633eab75bf305587018dfd09eac90be165b4d631774544041500ae00ab5db"} Apr 20 21:14:44.460900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:44.460874 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579df576b8-7lpnc" event={"ID":"a253c448-b3d5-4789-8811-020aa486a4f9","Type":"ContainerStarted","Data":"08e073180647cfe2b376975c6315a1f07791408610197e1d845ab6faf11a8145"} Apr 20 21:14:46.468165 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:46.468132 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" event={"ID":"8f276464-4ba8-4641-8b8f-67e5405b3f5e","Type":"ContainerStarted","Data":"83b1c2d0c964e10ffa00e053e45b6829ba417e0f6375d1d62d04e491ef3ce2ac"} Apr 20 21:14:46.468650 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:46.468357 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:46.473832 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:46.473809 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" Apr 20 21:14:46.483816 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:46.483769 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8nwtj" podStartSLOduration=2.161794032 podStartE2EDuration="3.483757185s" podCreationTimestamp="2026-04-20 21:14:43 +0000 UTC" firstStartedPulling="2026-04-20 21:14:44.197392211 +0000 UTC m=+105.590577730" lastFinishedPulling="2026-04-20 21:14:45.519355363 +0000 UTC m=+106.912540883" observedRunningTime="2026-04-20 21:14:46.482658523 +0000 UTC m=+107.875844059" watchObservedRunningTime="2026-04-20 21:14:46.483757185 +0000 UTC m=+107.876942727" Apr 20 21:14:47.226795 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.226760 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d65z5"] Apr 20 21:14:47.230083 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.230065 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.232551 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.232530 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 21:14:47.233610 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.233583 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-ldr2c\""
Apr 20 21:14:47.233610 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.233595 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 21:14:47.233761 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.233668 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 21:14:47.239227 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.239206 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d65z5"]
Apr 20 21:14:47.305538 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.305513 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.305632 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.305544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qms99\" (UniqueName: \"kubernetes.io/projected/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-kube-api-access-qms99\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.305632 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.305568 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.305632 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.305598 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.406762 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.406713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.406826 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.406766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.406826 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.406795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qms99\" (UniqueName: \"kubernetes.io/projected/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-kube-api-access-qms99\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.406826 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.406817 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.406955 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:47.406910 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 20 21:14:47.407003 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:47.406985 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-tls podName:11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a nodeName:}" failed. No retries permitted until 2026-04-20 21:14:47.906967043 +0000 UTC m=+109.300152562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-d65z5" (UID: "11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a") : secret "prometheus-operator-tls" not found
Apr 20 21:14:47.407414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.407395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.408955 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.408933 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.415290 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.415269 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qms99\" (UniqueName: \"kubernetes.io/projected/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-kube-api-access-qms99\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.471636 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.471609 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579df576b8-7lpnc" event={"ID":"a253c448-b3d5-4789-8811-020aa486a4f9","Type":"ContainerStarted","Data":"fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c"}
Apr 20 21:14:47.486595 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.486553 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-579df576b8-7lpnc" podStartSLOduration=1.5060115779999999 podStartE2EDuration="4.486538584s" podCreationTimestamp="2026-04-20 21:14:43 +0000 UTC" firstStartedPulling="2026-04-20 21:14:44.126228769 +0000 UTC m=+105.519414288" lastFinishedPulling="2026-04-20 21:14:47.106755762 +0000 UTC m=+108.499941294" observedRunningTime="2026-04-20 21:14:47.4859165 +0000 UTC m=+108.879102052" watchObservedRunningTime="2026-04-20 21:14:47.486538584 +0000 UTC m=+108.879724128"
Apr 20 21:14:47.911265 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.911241 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:47.916277 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:47.916252 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d65z5\" (UID: \"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:48.138630 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:48.138604 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5"
Apr 20 21:14:48.255663 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:48.255584 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d65z5"]
Apr 20 21:14:48.258027 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:48.257997 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11766c9d_c9d9_4a44_b8ad_cd8bd0c89f4a.slice/crio-11b56b509807d428c34b71e5e0d0214b9ad48e3e50a9df3b69052891c5b04281 WatchSource:0}: Error finding container 11b56b509807d428c34b71e5e0d0214b9ad48e3e50a9df3b69052891c5b04281: Status 404 returned error can't find the container with id 11b56b509807d428c34b71e5e0d0214b9ad48e3e50a9df3b69052891c5b04281
Apr 20 21:14:48.475219 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:48.475153 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5" event={"ID":"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a","Type":"ContainerStarted","Data":"11b56b509807d428c34b71e5e0d0214b9ad48e3e50a9df3b69052891c5b04281"}
Apr 20 21:14:50.481616 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:50.481585 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5" event={"ID":"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a","Type":"ContainerStarted","Data":"f86c93edd9fb9d76c40b4e32c1d0438c95b6202ca69f37f9f0d4cf8bd3901ee2"}
Apr 20 21:14:50.481616 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:50.481620 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5" event={"ID":"11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a","Type":"ContainerStarted","Data":"762f3fa8d55584d3a18268772cd71c7cd811d51d823d9be5da68ba88787058cd"}
Apr 20 21:14:50.498492 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:50.498449 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-d65z5" podStartSLOduration=1.935718372 podStartE2EDuration="3.498414195s" podCreationTimestamp="2026-04-20 21:14:47 +0000 UTC" firstStartedPulling="2026-04-20 21:14:48.259951576 +0000 UTC m=+109.653137109" lastFinishedPulling="2026-04-20 21:14:49.822647409 +0000 UTC m=+111.215832932" observedRunningTime="2026-04-20 21:14:50.497055738 +0000 UTC m=+111.890241279" watchObservedRunningTime="2026-04-20 21:14:50.498414195 +0000 UTC m=+111.891599736"
Apr 20 21:14:51.453681 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:51.453651 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qq8hb"
Apr 20 21:14:52.253253 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.253222 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8ddcc5ff7-n9ghj"]
Apr 20 21:14:52.256575 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.256554 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.265698 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.265679 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 21:14:52.279516 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.279496 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8ddcc5ff7-n9ghj"]
Apr 20 21:14:52.344432 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.344397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f44vb\" (UniqueName: \"kubernetes.io/projected/7d573021-c5bc-47b2-9ac6-42f44680ba76-kube-api-access-f44vb\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.344554 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.344517 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-oauth-config\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.344604 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.344573 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-config\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.344661 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.344619 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-serving-cert\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.344661 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.344648 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-trusted-ca-bundle\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.344766 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.344748 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-service-ca\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.344831 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.344807 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-oauth-serving-cert\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.445879 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.445856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-oauth-config\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.445886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-config\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.445907 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-serving-cert\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.445924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-trusted-ca-bundle\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.445958 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-service-ca\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.445978 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-oauth-serving-cert\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446256 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.446006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f44vb\" (UniqueName: \"kubernetes.io/projected/7d573021-c5bc-47b2-9ac6-42f44680ba76-kube-api-access-f44vb\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446706 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.446686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-service-ca\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446784 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.446685 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-config\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.446784 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.446753 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-oauth-serving-cert\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.447061 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.447036 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-trusted-ca-bundle\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.448257 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.448228 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-oauth-config\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.448531 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.448514 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-serving-cert\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.456216 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.456196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f44vb\" (UniqueName: \"kubernetes.io/projected/7d573021-c5bc-47b2-9ac6-42f44680ba76-kube-api-access-f44vb\") pod \"console-8ddcc5ff7-n9ghj\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.566553 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.566529 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:14:52.606207 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.606178 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"]
Apr 20 21:14:52.611039 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.611018 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.614817 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.614480 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 21:14:52.614817 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.614486 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 21:14:52.614817 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.614776 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-6b2gj\""
Apr 20 21:14:52.614817 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.614806 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 21:14:52.621746 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.621706 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"]
Apr 20 21:14:52.624566 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.624385 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5nqj6"]
Apr 20 21:14:52.628105 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.628015 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.630667 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.630488 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 21:14:52.631725 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.631131 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hrxkm\""
Apr 20 21:14:52.631847 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.631829 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 21:14:52.632392 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.632166 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 21:14:52.704396 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.704373 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8ddcc5ff7-n9ghj"]
Apr 20 21:14:52.706679 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:52.706655 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d573021_c5bc_47b2_9ac6_42f44680ba76.slice/crio-5a809a517a6a2ef4e0bb807ae5179bfe970c34a4a8b6a229b321057edec92d23 WatchSource:0}: Error finding container 5a809a517a6a2ef4e0bb807ae5179bfe970c34a4a8b6a229b321057edec92d23: Status 404 returned error can't find the container with id 5a809a517a6a2ef4e0bb807ae5179bfe970c34a4a8b6a229b321057edec92d23
Apr 20 21:14:52.747366 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747340 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.747482 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-root\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747482 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747411 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-textfile\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747584 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-tls\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747584 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45874952-f43e-4c25-9d03-c35a06b5dbbd-metrics-client-ca\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747684 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747619 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-accelerators-collector-config\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747684 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747649 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5b9v\" (UniqueName: \"kubernetes.io/projected/45874952-f43e-4c25-9d03-c35a06b5dbbd-kube-api-access-k5b9v\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747684 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747675 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.747816 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747700 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-wtmp\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747816 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747726 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f17ccda7-2ca6-4bc8-b586-635850795b77-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.747816 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747755 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.747816 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747777 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-sys\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.747998 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747833 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f17ccda7-2ca6-4bc8-b586-635850795b77-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.747998 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747876 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnpzj\" (UniqueName: \"kubernetes.io/projected/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-api-access-rnpzj\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.747998 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.747913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.848605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.848520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.848605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.848563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.848605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.848589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-root\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.848856 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.848619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-textfile\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.848856 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.848644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-tls\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.848856 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.848726 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-root\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.848856 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:52.848827 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 21:14:52.849074 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:52.848904 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-tls podName:45874952-f43e-4c25-9d03-c35a06b5dbbd nodeName:}" failed. No retries permitted until 2026-04-20 21:14:53.348883301 +0000 UTC m=+114.742068825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-tls") pod "node-exporter-5nqj6" (UID: "45874952-f43e-4c25-9d03-c35a06b5dbbd") : secret "node-exporter-tls" not found
Apr 20 21:14:52.849074 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.848993 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-textfile\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.849186 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.849069 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45874952-f43e-4c25-9d03-c35a06b5dbbd-metrics-client-ca\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.849186 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.849130 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-accelerators-collector-config\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.849186 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.849155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5b9v\" (UniqueName: \"kubernetes.io/projected/45874952-f43e-4c25-9d03-c35a06b5dbbd-kube-api-access-k5b9v\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.849488 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.849182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"
Apr 20 21:14:52.849676 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:52.849588 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 20 21:14:52.849676 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:52.849644 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-tls podName:f17ccda7-2ca6-4bc8-b586-635850795b77 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:53.349626996 +0000 UTC m=+114.742812531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jr5nj" (UID: "f17ccda7-2ca6-4bc8-b586-635850795b77") : secret "kube-state-metrics-tls" not found
Apr 20 21:14:52.849865 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.849762 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45874952-f43e-4c25-9d03-c35a06b5dbbd-metrics-client-ca\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.849865 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.849784 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-accelerators-collector-config\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.849865 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.849793 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-wtmp\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.850058 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850033 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-wtmp\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6"
Apr 20 21:14:52.850210 ip-10-0-132-45
kubenswrapper[2567]: I0420 21:14:52.850100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f17ccda7-2ca6-4bc8-b586-635850795b77-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.850532 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850511 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f17ccda7-2ca6-4bc8-b586-635850795b77-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.850614 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.850754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850665 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-sys\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6" Apr 20 21:14:52.850754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f17ccda7-2ca6-4bc8-b586-635850795b77-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.850754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45874952-f43e-4c25-9d03-c35a06b5dbbd-sys\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6" Apr 20 21:14:52.850754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850742 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnpzj\" (UniqueName: \"kubernetes.io/projected/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-api-access-rnpzj\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.850952 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.850805 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.851251 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.851233 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6" Apr 20 
21:14:52.851407 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.851388 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f17ccda7-2ca6-4bc8-b586-635850795b77-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.851493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.851475 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:52.856763 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.856744 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5b9v\" (UniqueName: \"kubernetes.io/projected/45874952-f43e-4c25-9d03-c35a06b5dbbd-kube-api-access-k5b9v\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6" Apr 20 21:14:52.857612 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:52.857597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnpzj\" (UniqueName: \"kubernetes.io/projected/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-api-access-rnpzj\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:53.354822 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.354793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-tls\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6" Apr 20 21:14:53.355308 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.354846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:53.357272 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.357248 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/45874952-f43e-4c25-9d03-c35a06b5dbbd-node-exporter-tls\") pod \"node-exporter-5nqj6\" (UID: \"45874952-f43e-4c25-9d03-c35a06b5dbbd\") " pod="openshift-monitoring/node-exporter-5nqj6" Apr 20 21:14:53.357272 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.357267 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f17ccda7-2ca6-4bc8-b586-635850795b77-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jr5nj\" (UID: \"f17ccda7-2ca6-4bc8-b586-635850795b77\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:53.494956 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.494928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8ddcc5ff7-n9ghj" event={"ID":"7d573021-c5bc-47b2-9ac6-42f44680ba76","Type":"ContainerStarted","Data":"fd3f2dd05ded4adf7234b69b4dc6bf5abd3db9abaf875dff096c2c076c9a668c"} Apr 20 21:14:53.495069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.494960 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-8ddcc5ff7-n9ghj" event={"ID":"7d573021-c5bc-47b2-9ac6-42f44680ba76","Type":"ContainerStarted","Data":"5a809a517a6a2ef4e0bb807ae5179bfe970c34a4a8b6a229b321057edec92d23"} Apr 20 21:14:53.515121 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.515084 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8ddcc5ff7-n9ghj" podStartSLOduration=1.515070808 podStartE2EDuration="1.515070808s" podCreationTimestamp="2026-04-20 21:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:14:53.514399857 +0000 UTC m=+114.907585399" watchObservedRunningTime="2026-04-20 21:14:53.515070808 +0000 UTC m=+114.908256350" Apr 20 21:14:53.525731 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.525710 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" Apr 20 21:14:53.540364 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.540343 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5nqj6" Apr 20 21:14:53.550059 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:53.550038 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45874952_f43e_4c25_9d03_c35a06b5dbbd.slice/crio-362ea294534032cddb5da58c3b8b81c9cbc71318adbdf2c8b099e37cecd275ee WatchSource:0}: Error finding container 362ea294534032cddb5da58c3b8b81c9cbc71318adbdf2c8b099e37cecd275ee: Status 404 returned error can't find the container with id 362ea294534032cddb5da58c3b8b81c9cbc71318adbdf2c8b099e37cecd275ee Apr 20 21:14:53.664068 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.664016 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jr5nj"] Apr 20 21:14:53.666655 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:53.666633 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf17ccda7_2ca6_4bc8_b586_635850795b77.slice/crio-9b86b64d23917f1f084234a70b30b2c06a1d0d995ec2648364987b9132acd56d WatchSource:0}: Error finding container 9b86b64d23917f1f084234a70b30b2c06a1d0d995ec2648364987b9132acd56d: Status 404 returned error can't find the container with id 9b86b64d23917f1f084234a70b30b2c06a1d0d995ec2648364987b9132acd56d Apr 20 21:14:53.687524 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.687502 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 21:14:53.692443 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.692411 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.694732 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.694711 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 21:14:53.694826 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.694748 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 21:14:53.694826 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.694767 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 21:14:53.694826 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.694798 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 21:14:53.695291 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.695118 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 21:14:53.695291 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.695160 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kgbrw\"" Apr 20 21:14:53.695291 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.695184 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 21:14:53.695291 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.695184 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 21:14:53.695291 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.695258 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 21:14:53.695601 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.695548 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 21:14:53.708637 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.708614 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 21:14:53.859627 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-config-volume\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.859747 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859633 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.859747 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-web-config\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.859747 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfm4k\" (UniqueName: 
\"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-kube-api-access-tfm4k\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.859848 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859751 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.859848 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859778 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.859848 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859811 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-config-out\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.859848 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859828 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.860009 ip-10-0-132-45 kubenswrapper[2567]: I0420 
21:14:53.859846 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.860009 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859883 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.860009 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859909 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.860009 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.859967 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.860159 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.860018 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961468 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961390 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961468 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961492 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961564 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-config-volume\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961591 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961792 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961626 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-web-config\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961792 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961665 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfm4k\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-kube-api-access-tfm4k\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961792 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961792 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.961792 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961784 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-config-out\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.962036 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.962036 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.961835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.962881 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.962854 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.962993 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:53.962961 2567 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 21:14:53.963055 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:14:53.963014 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls podName:6175a76e-94a4-4756-8404-b83648817d50 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:54.462996908 +0000 UTC m=+115.856182431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "6175a76e-94a4-4756-8404-b83648817d50") : secret "alertmanager-main-tls" not found Apr 20 21:14:53.963842 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.963816 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.964541 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.964515 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.964852 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.964673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-config-volume\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.964852 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.964781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.964852 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.964802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.965054 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.964985 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.965166 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.965146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.965206 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.965153 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.965441 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.965405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-web-config\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.966018 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.966003 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-config-out\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.969510 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.969488 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfm4k\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-kube-api-access-tfm4k\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:53.975376 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.975361 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:53.975441 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.975390 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:53.980195 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:53.980177 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:54.467439 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:54.467327 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:54.470569 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:54.470320 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:54.501027 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:54.500753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5nqj6" event={"ID":"45874952-f43e-4c25-9d03-c35a06b5dbbd","Type":"ContainerStarted","Data":"65bc0365b9fa3f0b8dd0bf7f1fddf37ffdea81c3a30ca16aa6f991d26277acca"} Apr 20 21:14:54.501027 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:54.500799 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5nqj6" event={"ID":"45874952-f43e-4c25-9d03-c35a06b5dbbd","Type":"ContainerStarted","Data":"362ea294534032cddb5da58c3b8b81c9cbc71318adbdf2c8b099e37cecd275ee"} Apr 20 21:14:54.502565 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:54.502451 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" event={"ID":"f17ccda7-2ca6-4bc8-b586-635850795b77","Type":"ContainerStarted","Data":"9b86b64d23917f1f084234a70b30b2c06a1d0d995ec2648364987b9132acd56d"} Apr 20 21:14:54.507348 ip-10-0-132-45 kubenswrapper[2567]: I0420 
21:14:54.507310 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:14:54.604650 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:54.604626 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 21:14:54.746876 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:54.746847 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 21:14:55.001493 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:55.001389 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6175a76e_94a4_4756_8404_b83648817d50.slice/crio-254e02ab31daa8b8cf334a6f97de6f836b639d4ee4fc88aee47a87c5b812f634 WatchSource:0}: Error finding container 254e02ab31daa8b8cf334a6f97de6f836b639d4ee4fc88aee47a87c5b812f634: Status 404 returned error can't find the container with id 254e02ab31daa8b8cf334a6f97de6f836b639d4ee4fc88aee47a87c5b812f634 Apr 20 21:14:55.506181 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:55.506148 2567 generic.go:358] "Generic (PLEG): container finished" podID="45874952-f43e-4c25-9d03-c35a06b5dbbd" containerID="65bc0365b9fa3f0b8dd0bf7f1fddf37ffdea81c3a30ca16aa6f991d26277acca" exitCode=0 Apr 20 21:14:55.506699 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:55.506224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5nqj6" event={"ID":"45874952-f43e-4c25-9d03-c35a06b5dbbd","Type":"ContainerDied","Data":"65bc0365b9fa3f0b8dd0bf7f1fddf37ffdea81c3a30ca16aa6f991d26277acca"} Apr 20 21:14:55.508081 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:55.508047 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" 
event={"ID":"f17ccda7-2ca6-4bc8-b586-635850795b77","Type":"ContainerStarted","Data":"c0820a86c24d37ecc0fc2bda60d9efb2c0eec788a83ae98b8da276053c60d6cc"} Apr 20 21:14:55.508194 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:55.508088 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" event={"ID":"f17ccda7-2ca6-4bc8-b586-635850795b77","Type":"ContainerStarted","Data":"ffd1a5f9eb247eb7a6d9a30c6cc96485bb945b1ed3d974eb64a2e87ee85bafb1"} Apr 20 21:14:55.508194 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:55.508101 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" event={"ID":"f17ccda7-2ca6-4bc8-b586-635850795b77","Type":"ContainerStarted","Data":"0d88e407bef2a2f450cc7f238b2b53480eef0e8ef26956159760de526127e71a"} Apr 20 21:14:55.509143 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:55.509112 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerStarted","Data":"254e02ab31daa8b8cf334a6f97de6f836b639d4ee4fc88aee47a87c5b812f634"} Apr 20 21:14:55.545237 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:55.545123 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jr5nj" podStartSLOduration=2.168482683 podStartE2EDuration="3.54510363s" podCreationTimestamp="2026-04-20 21:14:52 +0000 UTC" firstStartedPulling="2026-04-20 21:14:53.668704816 +0000 UTC m=+115.061890335" lastFinishedPulling="2026-04-20 21:14:55.045325749 +0000 UTC m=+116.438511282" observedRunningTime="2026-04-20 21:14:55.54476477 +0000 UTC m=+116.937950312" watchObservedRunningTime="2026-04-20 21:14:55.54510363 +0000 UTC m=+116.938289172" Apr 20 21:14:56.513505 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.513404 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-5nqj6" event={"ID":"45874952-f43e-4c25-9d03-c35a06b5dbbd","Type":"ContainerStarted","Data":"dbe1fd0137f69e9dff558ac523b64fb2e75a95bf5926d7a1756988679a8b2b0a"} Apr 20 21:14:56.513505 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.513465 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5nqj6" event={"ID":"45874952-f43e-4c25-9d03-c35a06b5dbbd","Type":"ContainerStarted","Data":"448ea7fb6552fb9bc7f249417ce5bf03b84c1812484593913ca9f86935c675d9"} Apr 20 21:14:56.514784 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.514757 2567 generic.go:358] "Generic (PLEG): container finished" podID="6175a76e-94a4-4756-8404-b83648817d50" containerID="d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876" exitCode=0 Apr 20 21:14:56.514860 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.514839 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876"} Apr 20 21:14:56.563010 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.562969 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5nqj6" podStartSLOduration=3.755052266 podStartE2EDuration="4.562954081s" podCreationTimestamp="2026-04-20 21:14:52 +0000 UTC" firstStartedPulling="2026-04-20 21:14:53.552476941 +0000 UTC m=+114.945662475" lastFinishedPulling="2026-04-20 21:14:54.360378739 +0000 UTC m=+115.753564290" observedRunningTime="2026-04-20 21:14:56.533884758 +0000 UTC m=+117.927070299" watchObservedRunningTime="2026-04-20 21:14:56.562954081 +0000 UTC m=+117.956139622" Apr 20 21:14:56.928736 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.928700 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6f8955cdbb-mw27t"] Apr 20 21:14:56.932232 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.932212 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:56.934882 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.934859 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 21:14:56.935042 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.934859 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 21:14:56.935042 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.935004 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-21shgj68oac3b\"" Apr 20 21:14:56.935257 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.935161 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 21:14:56.935320 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.935292 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 21:14:56.935795 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.935781 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-7w8q6\"" Apr 20 21:14:56.940371 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:56.940351 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f8955cdbb-mw27t"] Apr 20 21:14:57.092977 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.092941 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/af82c672-3cf2-41b2-9e5c-7784bf46eec5-metrics-server-audit-profiles\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.093160 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.092995 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af82c672-3cf2-41b2-9e5c-7784bf46eec5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.093160 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.093092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-secret-metrics-server-tls\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.093160 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.093156 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/af82c672-3cf2-41b2-9e5c-7784bf46eec5-audit-log\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.093306 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.093179 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjwrg\" (UniqueName: \"kubernetes.io/projected/af82c672-3cf2-41b2-9e5c-7784bf46eec5-kube-api-access-jjwrg\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: 
\"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.093306 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.093243 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-client-ca-bundle\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.093306 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.093275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-secret-metrics-server-client-certs\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.194584 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.194496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-client-ca-bundle\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.195125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.194544 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-secret-metrics-server-client-certs\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.195125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.195086 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/af82c672-3cf2-41b2-9e5c-7784bf46eec5-metrics-server-audit-profiles\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.195282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.195139 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af82c672-3cf2-41b2-9e5c-7784bf46eec5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.195282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.195192 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-secret-metrics-server-tls\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.195282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.195233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/af82c672-3cf2-41b2-9e5c-7784bf46eec5-audit-log\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.195282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.195264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjwrg\" (UniqueName: \"kubernetes.io/projected/af82c672-3cf2-41b2-9e5c-7784bf46eec5-kube-api-access-jjwrg\") pod 
\"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.200156 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.196109 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/af82c672-3cf2-41b2-9e5c-7784bf46eec5-audit-log\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.200156 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.196500 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af82c672-3cf2-41b2-9e5c-7784bf46eec5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.200156 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.197024 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/af82c672-3cf2-41b2-9e5c-7784bf46eec5-metrics-server-audit-profiles\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.200156 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.198130 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-secret-metrics-server-client-certs\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.201454 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.201410 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-secret-metrics-server-tls\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.201616 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.201591 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af82c672-3cf2-41b2-9e5c-7784bf46eec5-client-ca-bundle\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.203752 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.203734 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjwrg\" (UniqueName: \"kubernetes.io/projected/af82c672-3cf2-41b2-9e5c-7784bf46eec5-kube-api-access-jjwrg\") pod \"metrics-server-6f8955cdbb-mw27t\" (UID: \"af82c672-3cf2-41b2-9e5c-7784bf46eec5\") " pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.241707 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.241682 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:14:57.384337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.384308 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f8955cdbb-mw27t"] Apr 20 21:14:57.386934 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:57.386903 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf82c672_3cf2_41b2_9e5c_7784bf46eec5.slice/crio-e995e6d30aa7de186dca17bd678137aa62372689b95812828eae9c96cdf33d53 WatchSource:0}: Error finding container e995e6d30aa7de186dca17bd678137aa62372689b95812828eae9c96cdf33d53: Status 404 returned error can't find the container with id e995e6d30aa7de186dca17bd678137aa62372689b95812828eae9c96cdf33d53 Apr 20 21:14:57.472367 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.472292 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8ddcc5ff7-n9ghj"] Apr 20 21:14:57.519164 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:57.519130 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" event={"ID":"af82c672-3cf2-41b2-9e5c-7784bf46eec5","Type":"ContainerStarted","Data":"e995e6d30aa7de186dca17bd678137aa62372689b95812828eae9c96cdf33d53"} Apr 20 21:14:58.525340 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.525296 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerStarted","Data":"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"} Apr 20 21:14:58.525764 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.525344 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerStarted","Data":"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"} Apr 20 21:14:58.525764 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.525358 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerStarted","Data":"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"} Apr 20 21:14:58.525764 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.525371 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerStarted","Data":"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff"} Apr 20 21:14:58.525764 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.525384 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerStarted","Data":"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4"} Apr 20 21:14:58.779976 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.779633 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 21:14:58.785602 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.785568 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:14:58.788319 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.788048 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 21:14:58.788319 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.788081 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 21:14:58.788319 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.788121 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 21:14:58.788319 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.788217 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-gh5on71p28ep\"" Apr 20 21:14:58.788619 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.788414 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hmc7t\"" Apr 20 21:14:58.788619 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.788444 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 21:14:58.791435 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.790097 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 21:14:58.791435 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.790156 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 21:14:58.791435 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.790376 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 21:14:58.791435 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.790672 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 21:14:58.791435 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.790696 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 21:14:58.791435 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.791006 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 21:14:58.791435 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.791006 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 21:14:58.792977 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.792540 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 21:14:58.795953 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.795931 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 21:14:58.910947 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.910910 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:14:58.911125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.910958 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:14:58.911125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.910995 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:14:58.911125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911054 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:14:58.911125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911103 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:14:58.911329 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911125 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqp6l\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-kube-api-access-xqp6l\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:14:58.911329 ip-10-0-132-45 kubenswrapper[2567]: 
I0420 21:14:58.911183 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911329 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911212 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911329 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911306 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911329 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911330 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911369 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911404 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911461 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911492 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911546 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:58.911841 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:58.911589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014445 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014492 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014641 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqp6l\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-kube-api-access-xqp6l\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.015163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.014894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.017227 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018603 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018617 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018669 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-gh5on71p28ep\""
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018670 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018883 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.018898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.019362 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.019899 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.019383 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 21:14:59.020914 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.020889 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.021333 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.021116 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 21:14:59.021333 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.021137 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 21:14:59.021642 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.021624 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 21:14:59.021767 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.021749 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 21:14:59.021830 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.021817 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 21:14:59.022176 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.022151 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 21:14:59.022261 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.022221 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 21:14:59.022324 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.022224 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 21:14:59.024189 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.024166 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 21:14:59.027828 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.027806 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.028410 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.028386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqp6l\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-kube-api-access-xqp6l\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.030105 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.030054 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.030453 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.030433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.031096 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.031078 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 21:14:59.031667 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.031527 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.037131 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.036898 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.037131 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.036970 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.037131 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.037087 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.037269 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.037218 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.038417 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.038394 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.039081 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.039060 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.039159 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.039132 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.039286 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.039270 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.039664 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.039647 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.102892 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.102870 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hmc7t\""
Apr 20 21:14:59.111747 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.111723 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:14:59.278146 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.278113 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 21:14:59.278493 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:14:59.278465 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ff3fa9_677b_493a_8ab9_5937fa0cf5c8.slice/crio-de991db9c277569677819caecf187e369a94f88a9621db089c4c82a163229c0f WatchSource:0}: Error finding container de991db9c277569677819caecf187e369a94f88a9621db089c4c82a163229c0f: Status 404 returned error can't find the container with id de991db9c277569677819caecf187e369a94f88a9621db089c4c82a163229c0f
Apr 20 21:14:59.529074 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.529048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" event={"ID":"af82c672-3cf2-41b2-9e5c-7784bf46eec5","Type":"ContainerStarted","Data":"ed97d35e057f98779ad407379c1d28a89cc97fbabec63bbce909fc26cd11ed6d"}
Apr 20 21:14:59.530545 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.530518 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerID="266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575" exitCode=0
Apr 20 21:14:59.530643 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.530605 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575"}
Apr 20 21:14:59.530700 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.530641 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerStarted","Data":"de991db9c277569677819caecf187e369a94f88a9621db089c4c82a163229c0f"}
Apr 20 21:14:59.546849 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:14:59.546802 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" podStartSLOduration=1.8372468880000001 podStartE2EDuration="3.546785998s" podCreationTimestamp="2026-04-20 21:14:56 +0000 UTC" firstStartedPulling="2026-04-20 21:14:57.389084393 +0000 UTC m=+118.782269911" lastFinishedPulling="2026-04-20 21:14:59.098623502 +0000 UTC m=+120.491809021" observedRunningTime="2026-04-20 21:14:59.545243492 +0000 UTC m=+120.938429034" watchObservedRunningTime="2026-04-20 21:14:59.546785998 +0000 UTC m=+120.939971540"
Apr 20 21:15:00.538287 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:00.538247 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerStarted","Data":"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"}
Apr 20 21:15:00.568222 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:00.568168 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.071332405 podStartE2EDuration="7.568153375s" podCreationTimestamp="2026-04-20 21:14:53 +0000 UTC" firstStartedPulling="2026-04-20 21:14:55.003247872 +0000 UTC m=+116.396433391" lastFinishedPulling="2026-04-20 21:14:59.500068842 +0000 UTC m=+120.893254361" observedRunningTime="2026-04-20 21:15:00.566655701 +0000 UTC m=+121.959841281" watchObservedRunningTime="2026-04-20 21:15:00.568153375 +0000 UTC m=+121.961338915"
Apr 20 21:15:02.545637 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:02.545603 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerStarted","Data":"f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09"}
Apr 20 21:15:02.545637 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:02.545642 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerStarted","Data":"8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7"}
Apr 20 21:15:02.567000 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:02.566970 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8ddcc5ff7-n9ghj"
Apr 20 21:15:03.877573 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.877537 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-567b4d7d79-cdjlc"]
Apr 20 21:15:03.882959 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.882931 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:03.887043 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.887017 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-567b4d7d79-cdjlc"]
Apr 20 21:15:03.971483 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.971397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-trusted-ca-bundle\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:03.971651 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.971519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-serving-cert\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:03.971651 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.971581 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2ld\" (UniqueName: \"kubernetes.io/projected/b234f8c2-080c-4bbc-9324-43e9f84adc34-kube-api-access-kq2ld\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:03.971651 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.971623 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-oauth-config\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:03.971810 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.971679 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-service-ca\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:03.971810 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.971724 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-config\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:03.971810 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:03.971770 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-oauth-serving-cert\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.072633 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.072598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-service-ca\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.072810 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.072652 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-config\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.072810 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.072686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-oauth-serving-cert\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.072810 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.072715 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-trusted-ca-bundle\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.072980 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.072897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-serving-cert\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.072980 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.072965 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2ld\" (UniqueName: \"kubernetes.io/projected/b234f8c2-080c-4bbc-9324-43e9f84adc34-kube-api-access-kq2ld\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.073084 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.073015 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-oauth-config\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.073491 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.073322 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-service-ca\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.073491 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.073352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-config\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.073491 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.073382 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-oauth-serving-cert\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.073715 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.073506 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-trusted-ca-bundle\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.076065 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.076041 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-oauth-config\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.076240 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.076220 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-serving-cert\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.080781 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.080739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2ld\" (UniqueName: \"kubernetes.io/projected/b234f8c2-080c-4bbc-9324-43e9f84adc34-kube-api-access-kq2ld\") pod \"console-567b4d7d79-cdjlc\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.197327 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.197297 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-567b4d7d79-cdjlc"
Apr 20 21:15:04.340207 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.340181 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-567b4d7d79-cdjlc"]
Apr 20 21:15:04.342461 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:15:04.342411 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb234f8c2_080c_4bbc_9324_43e9f84adc34.slice/crio-24417bba11e280acce65e8c4bd31accea85def40911520f8f47bdd3ed5ba7097 WatchSource:0}: Error finding container 24417bba11e280acce65e8c4bd31accea85def40911520f8f47bdd3ed5ba7097: Status 404 returned error can't find the container with id 24417bba11e280acce65e8c4bd31accea85def40911520f8f47bdd3ed5ba7097
Apr 20 21:15:04.553215 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.553171 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567b4d7d79-cdjlc" event={"ID":"b234f8c2-080c-4bbc-9324-43e9f84adc34","Type":"ContainerStarted","Data":"de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b"}
Apr 20 21:15:04.553389 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.553221 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567b4d7d79-cdjlc" event={"ID":"b234f8c2-080c-4bbc-9324-43e9f84adc34","Type":"ContainerStarted","Data":"24417bba11e280acce65e8c4bd31accea85def40911520f8f47bdd3ed5ba7097"}
Apr 20 21:15:04.555851 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.555821 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerStarted","Data":"2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030"}
Apr 20 21:15:04.555943 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.555854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerStarted","Data":"e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a"}
Apr 20 21:15:04.555943 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.555871 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerStarted","Data":"92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b"}
Apr 20 21:15:04.555943 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.555886 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerStarted","Data":"4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8"}
Apr 20 21:15:04.571263 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.571220 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-567b4d7d79-cdjlc" podStartSLOduration=1.571207679 podStartE2EDuration="1.571207679s" podCreationTimestamp="2026-04-20 21:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:15:04.570002063 +0000 UTC m=+125.963187617" watchObservedRunningTime="2026-04-20 21:15:04.571207679 +0000 UTC m=+125.964393219"
Apr 20 21:15:04.596044 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:04.596004 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.035115661 podStartE2EDuration="6.595992889s" podCreationTimestamp="2026-04-20 21:14:58 +0000 UTC" firstStartedPulling="2026-04-20 21:14:59.531595768 +0000 UTC m=+120.924781286" lastFinishedPulling="2026-04-20 21:15:04.09247298 +0000 UTC m=+125.485658514" observedRunningTime="2026-04-20 21:15:04.595558458 +0000 UTC
m=+125.988744000" watchObservedRunningTime="2026-04-20 21:15:04.595992889 +0000 UTC m=+125.989178426" Apr 20 21:15:08.915476 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:08.915414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:15:08.917806 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:08.917778 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec3d4534-0f04-46f4-8eae-d37ac21ac0c6-metrics-certs\") pod \"network-metrics-daemon-6p5ds\" (UID: \"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6\") " pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:15:09.111998 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:09.111971 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:15:09.141325 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:09.141304 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lxd5q\"" Apr 20 21:15:09.149669 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:09.149652 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6p5ds" Apr 20 21:15:09.265693 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:09.265670 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6p5ds"] Apr 20 21:15:09.268089 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:15:09.268058 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec3d4534_0f04_46f4_8eae_d37ac21ac0c6.slice/crio-9bfb017195ec2a2cdf9277288c6c005ea5c6cad6fa4e6656a71f8a4d42f5c0b0 WatchSource:0}: Error finding container 9bfb017195ec2a2cdf9277288c6c005ea5c6cad6fa4e6656a71f8a4d42f5c0b0: Status 404 returned error can't find the container with id 9bfb017195ec2a2cdf9277288c6c005ea5c6cad6fa4e6656a71f8a4d42f5c0b0 Apr 20 21:15:09.573380 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:09.573349 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6p5ds" event={"ID":"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6","Type":"ContainerStarted","Data":"9bfb017195ec2a2cdf9277288c6c005ea5c6cad6fa4e6656a71f8a4d42f5c0b0"} Apr 20 21:15:10.578556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:10.578524 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6p5ds" event={"ID":"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6","Type":"ContainerStarted","Data":"00ffce37e2e7b84526cacd721d7a484928b58ec1cf14ad3612cfe27eadb5bbd1"} Apr 20 21:15:10.578556 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:10.578559 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6p5ds" event={"ID":"ec3d4534-0f04-46f4-8eae-d37ac21ac0c6","Type":"ContainerStarted","Data":"7c9e0ebbb7a94367b21147a41b644a627912021a2eabcc4df42ee05624017d24"} Apr 20 21:15:10.593765 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:10.593725 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-6p5ds" podStartSLOduration=130.677258308 podStartE2EDuration="2m11.593712045s" podCreationTimestamp="2026-04-20 21:12:59 +0000 UTC" firstStartedPulling="2026-04-20 21:15:09.269923667 +0000 UTC m=+130.663109186" lastFinishedPulling="2026-04-20 21:15:10.186377389 +0000 UTC m=+131.579562923" observedRunningTime="2026-04-20 21:15:10.592379333 +0000 UTC m=+131.985564876" watchObservedRunningTime="2026-04-20 21:15:10.593712045 +0000 UTC m=+131.986897586" Apr 20 21:15:14.197995 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:14.197958 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-567b4d7d79-cdjlc" Apr 20 21:15:14.197995 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:14.198002 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-567b4d7d79-cdjlc" Apr 20 21:15:14.202796 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:14.202768 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-567b4d7d79-cdjlc" Apr 20 21:15:14.595500 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:14.595476 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-567b4d7d79-cdjlc" Apr 20 21:15:14.643562 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:14.643535 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579df576b8-7lpnc"] Apr 20 21:15:17.245331 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:17.245292 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:15:17.245879 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:17.245849 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:15:22.492754 ip-10-0-132-45 kubenswrapper[2567]: 
I0420 21:15:22.492686 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8ddcc5ff7-n9ghj" podUID="7d573021-c5bc-47b2-9ac6-42f44680ba76" containerName="console" containerID="cri-o://fd3f2dd05ded4adf7234b69b4dc6bf5abd3db9abaf875dff096c2c076c9a668c" gracePeriod=15 Apr 20 21:15:22.618111 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.618088 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8ddcc5ff7-n9ghj_7d573021-c5bc-47b2-9ac6-42f44680ba76/console/0.log" Apr 20 21:15:22.618214 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.618130 2567 generic.go:358] "Generic (PLEG): container finished" podID="7d573021-c5bc-47b2-9ac6-42f44680ba76" containerID="fd3f2dd05ded4adf7234b69b4dc6bf5abd3db9abaf875dff096c2c076c9a668c" exitCode=2 Apr 20 21:15:22.618214 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.618174 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8ddcc5ff7-n9ghj" event={"ID":"7d573021-c5bc-47b2-9ac6-42f44680ba76","Type":"ContainerDied","Data":"fd3f2dd05ded4adf7234b69b4dc6bf5abd3db9abaf875dff096c2c076c9a668c"} Apr 20 21:15:22.744510 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.744459 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8ddcc5ff7-n9ghj_7d573021-c5bc-47b2-9ac6-42f44680ba76/console/0.log" Apr 20 21:15:22.744605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.744533 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8ddcc5ff7-n9ghj" Apr 20 21:15:22.825851 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.825826 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-config\") pod \"7d573021-c5bc-47b2-9ac6-42f44680ba76\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " Apr 20 21:15:22.825966 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.825866 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-oauth-config\") pod \"7d573021-c5bc-47b2-9ac6-42f44680ba76\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " Apr 20 21:15:22.825966 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.825910 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-oauth-serving-cert\") pod \"7d573021-c5bc-47b2-9ac6-42f44680ba76\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " Apr 20 21:15:22.825966 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.825933 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-serving-cert\") pod \"7d573021-c5bc-47b2-9ac6-42f44680ba76\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " Apr 20 21:15:22.826125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.825976 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f44vb\" (UniqueName: \"kubernetes.io/projected/7d573021-c5bc-47b2-9ac6-42f44680ba76-kube-api-access-f44vb\") pod \"7d573021-c5bc-47b2-9ac6-42f44680ba76\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " Apr 20 21:15:22.826125 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.826028 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-service-ca\") pod \"7d573021-c5bc-47b2-9ac6-42f44680ba76\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " Apr 20 21:15:22.826125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.826053 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-trusted-ca-bundle\") pod \"7d573021-c5bc-47b2-9ac6-42f44680ba76\" (UID: \"7d573021-c5bc-47b2-9ac6-42f44680ba76\") " Apr 20 21:15:22.826281 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.826212 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-config" (OuterVolumeSpecName: "console-config") pod "7d573021-c5bc-47b2-9ac6-42f44680ba76" (UID: "7d573021-c5bc-47b2-9ac6-42f44680ba76"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:15:22.826336 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.826278 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7d573021-c5bc-47b2-9ac6-42f44680ba76" (UID: "7d573021-c5bc-47b2-9ac6-42f44680ba76"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:15:22.826336 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.826293 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:22.826602 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.826560 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-service-ca" (OuterVolumeSpecName: "service-ca") pod "7d573021-c5bc-47b2-9ac6-42f44680ba76" (UID: "7d573021-c5bc-47b2-9ac6-42f44680ba76"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:15:22.826602 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.826568 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7d573021-c5bc-47b2-9ac6-42f44680ba76" (UID: "7d573021-c5bc-47b2-9ac6-42f44680ba76"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:15:22.829047 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.829023 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7d573021-c5bc-47b2-9ac6-42f44680ba76" (UID: "7d573021-c5bc-47b2-9ac6-42f44680ba76"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:15:22.829047 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.829042 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7d573021-c5bc-47b2-9ac6-42f44680ba76" (UID: "7d573021-c5bc-47b2-9ac6-42f44680ba76"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:15:22.829172 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.829061 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d573021-c5bc-47b2-9ac6-42f44680ba76-kube-api-access-f44vb" (OuterVolumeSpecName: "kube-api-access-f44vb") pod "7d573021-c5bc-47b2-9ac6-42f44680ba76" (UID: "7d573021-c5bc-47b2-9ac6-42f44680ba76"). InnerVolumeSpecName "kube-api-access-f44vb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:15:22.927266 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.927242 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-service-ca\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:22.927266 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.927266 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-trusted-ca-bundle\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:22.927391 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.927277 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-oauth-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:22.927391 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:15:22.927288 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d573021-c5bc-47b2-9ac6-42f44680ba76-oauth-serving-cert\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:22.927391 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.927298 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d573021-c5bc-47b2-9ac6-42f44680ba76-console-serving-cert\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:22.927391 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:22.927307 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f44vb\" (UniqueName: \"kubernetes.io/projected/7d573021-c5bc-47b2-9ac6-42f44680ba76-kube-api-access-f44vb\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:23.623141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:23.623118 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8ddcc5ff7-n9ghj_7d573021-c5bc-47b2-9ac6-42f44680ba76/console/0.log" Apr 20 21:15:23.623514 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:23.623218 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8ddcc5ff7-n9ghj" event={"ID":"7d573021-c5bc-47b2-9ac6-42f44680ba76","Type":"ContainerDied","Data":"5a809a517a6a2ef4e0bb807ae5179bfe970c34a4a8b6a229b321057edec92d23"} Apr 20 21:15:23.623514 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:23.623233 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8ddcc5ff7-n9ghj" Apr 20 21:15:23.623514 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:23.623251 2567 scope.go:117] "RemoveContainer" containerID="fd3f2dd05ded4adf7234b69b4dc6bf5abd3db9abaf875dff096c2c076c9a668c" Apr 20 21:15:23.640775 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:23.640750 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8ddcc5ff7-n9ghj"] Apr 20 21:15:23.644004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:23.643983 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8ddcc5ff7-n9ghj"] Apr 20 21:15:25.125857 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:25.125822 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d573021-c5bc-47b2-9ac6-42f44680ba76" path="/var/lib/kubelet/pods/7d573021-c5bc-47b2-9ac6-42f44680ba76/volumes" Apr 20 21:15:37.246981 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:37.246911 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:15:37.251040 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:37.251017 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6f8955cdbb-mw27t" Apr 20 21:15:39.662413 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:39.662375 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-579df576b8-7lpnc" podUID="a253c448-b3d5-4789-8811-020aa486a4f9" containerName="console" containerID="cri-o://fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c" gracePeriod=15 Apr 20 21:15:39.904043 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:39.904020 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579df576b8-7lpnc_a253c448-b3d5-4789-8811-020aa486a4f9/console/0.log" Apr 20 21:15:39.904166 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:15:39.904092 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579df576b8-7lpnc" Apr 20 21:15:40.057403 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.057376 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-serving-cert\") pod \"a253c448-b3d5-4789-8811-020aa486a4f9\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " Apr 20 21:15:40.057569 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.057497 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-service-ca\") pod \"a253c448-b3d5-4789-8811-020aa486a4f9\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " Apr 20 21:15:40.057569 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.057536 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rszxg\" (UniqueName: \"kubernetes.io/projected/a253c448-b3d5-4789-8811-020aa486a4f9-kube-api-access-rszxg\") pod \"a253c448-b3d5-4789-8811-020aa486a4f9\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " Apr 20 21:15:40.057569 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.057562 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-oauth-serving-cert\") pod \"a253c448-b3d5-4789-8811-020aa486a4f9\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " Apr 20 21:15:40.057743 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.057605 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-oauth-config\") pod 
\"a253c448-b3d5-4789-8811-020aa486a4f9\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " Apr 20 21:15:40.057743 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.057629 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-console-config\") pod \"a253c448-b3d5-4789-8811-020aa486a4f9\" (UID: \"a253c448-b3d5-4789-8811-020aa486a4f9\") " Apr 20 21:15:40.058005 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.057979 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-service-ca" (OuterVolumeSpecName: "service-ca") pod "a253c448-b3d5-4789-8811-020aa486a4f9" (UID: "a253c448-b3d5-4789-8811-020aa486a4f9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:15:40.058123 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.058102 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-console-config" (OuterVolumeSpecName: "console-config") pod "a253c448-b3d5-4789-8811-020aa486a4f9" (UID: "a253c448-b3d5-4789-8811-020aa486a4f9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:15:40.058178 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.058100 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a253c448-b3d5-4789-8811-020aa486a4f9" (UID: "a253c448-b3d5-4789-8811-020aa486a4f9"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:15:40.059590 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.059567 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a253c448-b3d5-4789-8811-020aa486a4f9" (UID: "a253c448-b3d5-4789-8811-020aa486a4f9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:15:40.059686 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.059611 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a253c448-b3d5-4789-8811-020aa486a4f9-kube-api-access-rszxg" (OuterVolumeSpecName: "kube-api-access-rszxg") pod "a253c448-b3d5-4789-8811-020aa486a4f9" (UID: "a253c448-b3d5-4789-8811-020aa486a4f9"). InnerVolumeSpecName "kube-api-access-rszxg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:15:40.059724 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.059700 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a253c448-b3d5-4789-8811-020aa486a4f9" (UID: "a253c448-b3d5-4789-8811-020aa486a4f9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:15:40.158396 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.158364 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-oauth-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:40.158396 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.158397 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-console-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:40.158396 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.158407 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a253c448-b3d5-4789-8811-020aa486a4f9-console-serving-cert\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:40.158578 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.158416 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-service-ca\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:40.158578 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.158443 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rszxg\" (UniqueName: \"kubernetes.io/projected/a253c448-b3d5-4789-8811-020aa486a4f9-kube-api-access-rszxg\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:40.158578 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.158452 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a253c448-b3d5-4789-8811-020aa486a4f9-oauth-serving-cert\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:15:40.680586 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:15:40.680563 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-579df576b8-7lpnc_a253c448-b3d5-4789-8811-020aa486a4f9/console/0.log"
Apr 20 21:15:40.680953 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.680602 2567 generic.go:358] "Generic (PLEG): container finished" podID="a253c448-b3d5-4789-8811-020aa486a4f9" containerID="fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c" exitCode=2
Apr 20 21:15:40.680953 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.680671 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579df576b8-7lpnc"
Apr 20 21:15:40.680953 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.680692 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579df576b8-7lpnc" event={"ID":"a253c448-b3d5-4789-8811-020aa486a4f9","Type":"ContainerDied","Data":"fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c"}
Apr 20 21:15:40.680953 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.680731 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579df576b8-7lpnc" event={"ID":"a253c448-b3d5-4789-8811-020aa486a4f9","Type":"ContainerDied","Data":"08e073180647cfe2b376975c6315a1f07791408610197e1d845ab6faf11a8145"}
Apr 20 21:15:40.680953 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.680748 2567 scope.go:117] "RemoveContainer" containerID="fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c"
Apr 20 21:15:40.690085 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.690063 2567 scope.go:117] "RemoveContainer" containerID="fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c"
Apr 20 21:15:40.690376 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:15:40.690358 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c\": container with ID starting with fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c not found: ID does not exist" containerID="fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c"
Apr 20 21:15:40.690539 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.690382 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c"} err="failed to get container status \"fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c\": rpc error: code = NotFound desc = could not find container \"fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c\": container with ID starting with fc1dbea24fba09e8957b06eb0fbbd3d689b0a2ce299caf3fed63dfc2eb8dd67c not found: ID does not exist"
Apr 20 21:15:40.701469 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.701411 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-579df576b8-7lpnc"]
Apr 20 21:15:40.706611 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:40.706586 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-579df576b8-7lpnc"]
Apr 20 21:15:41.126566 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:41.126542 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a253c448-b3d5-4789-8811-020aa486a4f9" path="/var/lib/kubelet/pods/a253c448-b3d5-4789-8811-020aa486a4f9/volumes"
Apr 20 21:15:47.705076 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:47.705042 2567 generic.go:358] "Generic (PLEG): container finished" podID="b4535173-97a4-4c0b-aba0-a435bd525510" containerID="efb1190a603d65c3dca9a7e90692f8b180bd2759fe08e171a8af311c51e1d5be" exitCode=0
Apr 20 21:15:47.705415 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:47.705116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" event={"ID":"b4535173-97a4-4c0b-aba0-a435bd525510","Type":"ContainerDied","Data":"efb1190a603d65c3dca9a7e90692f8b180bd2759fe08e171a8af311c51e1d5be"}
Apr 20 21:15:47.705415 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:47.705415 2567 scope.go:117] "RemoveContainer" containerID="efb1190a603d65c3dca9a7e90692f8b180bd2759fe08e171a8af311c51e1d5be"
Apr 20 21:15:48.710523 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:48.710486 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sfxjw" event={"ID":"b4535173-97a4-4c0b-aba0-a435bd525510","Type":"ContainerStarted","Data":"329f24da14e370943cfb760170f13c4a2b14b98140e18c14bb892cfd1944623e"}
Apr 20 21:15:59.112408 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:59.112379 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:15:59.130980 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:59.130960 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:15:59.764155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:15:59.764128 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:12.906551 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:12.906515 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:16:12.906974 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:12.906929 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="alertmanager" containerID="cri-o://c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4" gracePeriod=120
Apr 20 21:16:12.907033 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:12.906990 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-metric" containerID="cri-o://bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279" gracePeriod=120
Apr 20 21:16:12.907033 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:12.907003 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-web" containerID="cri-o://6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f" gracePeriod=120
Apr 20 21:16:12.907137 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:12.907040 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="config-reloader" containerID="cri-o://ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff" gracePeriod=120
Apr 20 21:16:12.907137 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:12.907072 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy" containerID="cri-o://7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc" gracePeriod=120
Apr 20 21:16:12.907137 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:12.907078 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="prom-label-proxy" containerID="cri-o://61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514" gracePeriod=120
Apr 20 21:16:13.795903 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795866 2567 generic.go:358] "Generic (PLEG): container finished" podID="6175a76e-94a4-4756-8404-b83648817d50" containerID="61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514" exitCode=0
Apr 20 21:16:13.795903 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795897 2567 generic.go:358] "Generic (PLEG): container finished" podID="6175a76e-94a4-4756-8404-b83648817d50" containerID="7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc" exitCode=0
Apr 20 21:16:13.795903 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795908 2567 generic.go:358] "Generic (PLEG): container finished" podID="6175a76e-94a4-4756-8404-b83648817d50" containerID="ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff" exitCode=0
Apr 20 21:16:13.796141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795915 2567 generic.go:358] "Generic (PLEG): container finished" podID="6175a76e-94a4-4756-8404-b83648817d50" containerID="c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4" exitCode=0
Apr 20 21:16:13.796141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795943 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"}
Apr 20 21:16:13.796141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795980 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"}
Apr 20 21:16:13.796141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795990 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff"}
Apr 20 21:16:13.796141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:13.795999 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4"}
Apr 20 21:16:14.151503 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.151479 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:14.221014 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.220986 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-web\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221027 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-main-db\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221055 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-trusted-ca-bundle\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221090 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-tls-assets\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221113 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-config-out\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221155 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221144 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-metrics-client-ca\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221169 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-config-volume\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221204 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-metric\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221238 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-cluster-tls-config\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221272 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221305 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221329 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfm4k\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-kube-api-access-tfm4k\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221357 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-web-config\") pod \"6175a76e-94a4-4756-8404-b83648817d50\" (UID: \"6175a76e-94a4-4756-8404-b83648817d50\") "
Apr 20 21:16:14.221414 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221357 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:16:14.221793 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221532 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:16:14.221793 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221695 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.221793 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.221714 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-alertmanager-main-db\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.223293 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.223240 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:16:14.224363 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.224324 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:16:14.224612 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.224569 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:14.224896 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.224868 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:14.225650 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.225568 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-config-out" (OuterVolumeSpecName: "config-out") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:16:14.225650 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.225581 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-kube-api-access-tfm4k" (OuterVolumeSpecName: "kube-api-access-tfm4k") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "kube-api-access-tfm4k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:16:14.225650 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.225586 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-config-volume" (OuterVolumeSpecName: "config-volume") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:14.225928 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.225901 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:14.226141 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.226117 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:14.229592 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.229379 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:14.236394 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.236366 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-web-config" (OuterVolumeSpecName: "web-config") pod "6175a76e-94a4-4756-8404-b83648817d50" (UID: "6175a76e-94a4-4756-8404-b83648817d50"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:14.323094 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323043 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-tls-assets\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323094 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323073 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6175a76e-94a4-4756-8404-b83648817d50-config-out\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323094 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323089 2567 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6175a76e-94a4-4756-8404-b83648817d50-metrics-client-ca\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323102 2567 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-config-volume\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323116 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323130 2567 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-cluster-tls-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323142 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-main-tls\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323155 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323169 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfm4k\" (UniqueName: \"kubernetes.io/projected/6175a76e-94a4-4756-8404-b83648817d50-kube-api-access-tfm4k\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323209 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-web-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.323244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.323223 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6175a76e-94a4-4756-8404-b83648817d50-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:14.801791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.801761 2567 generic.go:358] "Generic (PLEG): container finished" podID="6175a76e-94a4-4756-8404-b83648817d50" containerID="bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279" exitCode=0
Apr 20 21:16:14.801791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.801788 2567 generic.go:358] "Generic (PLEG): container finished" podID="6175a76e-94a4-4756-8404-b83648817d50" containerID="6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f" exitCode=0
Apr 20 21:16:14.801968 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.801824 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"}
Apr 20 21:16:14.801968 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.801861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"}
Apr 20 21:16:14.801968 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.801876 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6175a76e-94a4-4756-8404-b83648817d50","Type":"ContainerDied","Data":"254e02ab31daa8b8cf334a6f97de6f836b639d4ee4fc88aee47a87c5b812f634"}
Apr 20 21:16:14.801968 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.801861 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 21:16:14.801968 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.801933 2567 scope.go:117] "RemoveContainer" containerID="61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"
Apr 20 21:16:14.810144 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.810128 2567 scope.go:117] "RemoveContainer" containerID="bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"
Apr 20 21:16:14.817123 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.817104 2567 scope.go:117] "RemoveContainer" containerID="7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"
Apr 20 21:16:14.823221 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.823204 2567 scope.go:117] "RemoveContainer" containerID="6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"
Apr 20 21:16:14.829260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.829237 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:16:14.830011 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.829992 2567 scope.go:117] "RemoveContainer" containerID="ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff"
Apr 20 21:16:14.836066 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.836051 2567 scope.go:117] "RemoveContainer" containerID="c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4"
Apr 20 21:16:14.843040 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.843024 2567 scope.go:117] "RemoveContainer" containerID="d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876"
Apr 20 21:16:14.843433 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.843400 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 21:16:14.849771 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.849755 2567 scope.go:117] "RemoveContainer" containerID="61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"
Apr 20 21:16:14.849999 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:14.849984 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514\": container with ID starting with 61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514 not found: ID does not exist" containerID="61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"
Apr 20 21:16:14.850033 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.850006 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"} err="failed to get container status \"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514\": rpc error: code = NotFound desc = could not find container \"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514\": container with ID starting with 61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514 not found: ID does not exist"
Apr 20 21:16:14.850033 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.850022 2567 scope.go:117] "RemoveContainer" containerID="bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"
Apr 20 21:16:14.850209 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:14.850196 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279\": container with ID starting with bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279 not found: ID does not exist" containerID="bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"
Apr 20 21:16:14.850242 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.850214 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"} err="failed to get container status \"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279\": rpc error: code = NotFound desc = could not find container \"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279\": container with ID starting with bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279 not found: ID does not exist"
Apr 20 21:16:14.850242 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.850226 2567 scope.go:117] "RemoveContainer" containerID="7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"
Apr 20 21:16:14.852584 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:14.852556 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc\": container with ID starting with 7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc not found: ID does not exist" containerID="7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"
Apr 20 21:16:14.852679 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.852592 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"} err="failed to get container status \"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc\": rpc error: code = NotFound desc = could not find container \"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc\": container with ID starting with 7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc not found: ID does not exist"
Apr 20 21:16:14.852679 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.852615 2567 scope.go:117] "RemoveContainer" containerID="6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"
Apr 20 21:16:14.853048 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:14.853029 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f\": container with ID starting with 6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f not found: ID does not exist" containerID="6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"
Apr 20 21:16:14.853126 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.853055 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"} err="failed to get container status \"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f\": rpc error: code = NotFound desc = could not find container \"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f\": container with ID starting with 6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f not found: ID does not exist"
Apr 20 21:16:14.853126 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.853079 2567 scope.go:117] "RemoveContainer" containerID="ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff"
Apr 20 21:16:14.853508 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:14.853486 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff\": container with ID starting with ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff not found: ID does not exist" containerID="ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff"
Apr 20 21:16:14.853591 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.853508 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff"} err="failed to get container status \"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff\": rpc error: code = NotFound desc = could not find container \"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff\": container with ID starting with ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff not found: ID does not exist"
Apr 20 21:16:14.853591 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.853522 2567 scope.go:117] "RemoveContainer" containerID="c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4"
Apr 20 21:16:14.853755 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:14.853741 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4\": container with ID starting with c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4 not found: ID does not exist" containerID="c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4"
Apr 20 21:16:14.853794 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.853757 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4"} err="failed to get container status \"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4\": rpc error: code = NotFound desc = could not find container \"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4\": container with ID starting with c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4 not found: ID does not exist"
Apr 20 21:16:14.853794 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.853772 2567 scope.go:117] "RemoveContainer" containerID="d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876"
Apr 20 21:16:14.853988 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:14.853969 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876\": container with ID starting with d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876 not found: ID does not exist" containerID="d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876"
Apr 20 21:16:14.854051 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.853997 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876"} err="failed to get container status \"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876\": rpc error: code = NotFound desc = could not find container \"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876\": container with ID starting with d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876 not found: ID does not exist"
Apr 20 21:16:14.854051 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854023 2567 scope.go:117] "RemoveContainer" containerID="61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"
Apr 20 21:16:14.854267 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854248 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514"} err="failed to get container status \"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514\": rpc error: code = NotFound desc = could not find container \"61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514\": container with ID starting with 61aa1aa913bec4f7444312c51da9e09edc608cb1ab2c64b66a7f61cd0ec8d514 not found: ID does not exist"
Apr 20 21:16:14.854314 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854268 2567 scope.go:117] "RemoveContainer" containerID="bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"
Apr 20 21:16:14.854517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854498 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279"} err="failed to get container status \"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279\": rpc error: code = NotFound desc = could not find container \"bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279\": container with ID starting with bdfd8f57f2137d29ec464ae4f3f8b9db25e691e45c50c3e84226496a43cfa279 not found: ID does not exist"
Apr 20 21:16:14.854572 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854517 2567 scope.go:117] "RemoveContainer" containerID="7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"
Apr 20 21:16:14.854719 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854702 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc"} err="failed to get container status \"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc\": rpc error: code = NotFound desc = could not find container \"7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc\": container with ID starting with 7ddaecc154e6a0e1efc307b7b40661a194a2ea65147e3655c3a0441cb54991cc not found: ID does not exist"
Apr 20 21:16:14.854783 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854721 2567 scope.go:117] "RemoveContainer" containerID="6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"
Apr 20 21:16:14.854941 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854924 2567 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f"} err="failed to get container status \"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f\": rpc error: code = NotFound desc = could not find container \"6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f\": container with ID starting with 6d3b639d5af87fae0b2cab82f0f398cd5740b5f11669601ef92945d4981a7f4f not found: ID does not exist" Apr 20 21:16:14.854990 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.854941 2567 scope.go:117] "RemoveContainer" containerID="ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff" Apr 20 21:16:14.855159 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.855140 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff"} err="failed to get container status \"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff\": rpc error: code = NotFound desc = could not find container \"ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff\": container with ID starting with ac78991d42f5c40a6f389544808f84484dffec9294cc5854321675a158347fff not found: ID does not exist" Apr 20 21:16:14.855202 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.855160 2567 scope.go:117] "RemoveContainer" containerID="c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4" Apr 20 21:16:14.855329 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.855315 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4"} err="failed to get container status \"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4\": rpc error: code = NotFound desc = could not find container \"c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4\": container with ID starting with 
c3f6e4f6f6ac24775f0f7e9e3ae5214696be823657abec42ba495f84ff8c74c4 not found: ID does not exist" Apr 20 21:16:14.855373 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.855330 2567 scope.go:117] "RemoveContainer" containerID="d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876" Apr 20 21:16:14.855519 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:14.855503 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876"} err="failed to get container status \"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876\": rpc error: code = NotFound desc = could not find container \"d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876\": container with ID starting with d7cbad0747b5975813843f2d85cd0e008b26ad5a218953e29b36d3e1b3a46876 not found: ID does not exist" Apr 20 21:16:15.126214 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:15.126152 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6175a76e-94a4-4756-8404-b83648817d50" path="/var/lib/kubelet/pods/6175a76e-94a4-4756-8404-b83648817d50/volumes" Apr 20 21:16:17.140834 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.140798 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 21:16:17.141332 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.141226 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="prometheus" containerID="cri-o://8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7" gracePeriod=600 Apr 20 21:16:17.141332 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.141262 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" 
containerName="thanos-sidecar" containerID="cri-o://4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8" gracePeriod=600 Apr 20 21:16:17.141462 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.141346 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="config-reloader" containerID="cri-o://f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09" gracePeriod=600 Apr 20 21:16:17.141462 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.141374 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-web" containerID="cri-o://92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b" gracePeriod=600 Apr 20 21:16:17.141577 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.141477 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-thanos" containerID="cri-o://2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030" gracePeriod=600 Apr 20 21:16:17.141630 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.141575 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy" containerID="cri-o://e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a" gracePeriod=600 Apr 20 21:16:17.814135 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814099 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerID="2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030" exitCode=0 Apr 20 21:16:17.814135 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:16:17.814122 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerID="e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a" exitCode=0 Apr 20 21:16:17.814135 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814130 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerID="4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8" exitCode=0 Apr 20 21:16:17.814135 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814137 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerID="f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09" exitCode=0 Apr 20 21:16:17.814135 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814142 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerID="8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7" exitCode=0 Apr 20 21:16:17.814439 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814174 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030"} Apr 20 21:16:17.814439 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814217 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a"} Apr 20 21:16:17.814439 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814233 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8"} Apr 20 21:16:17.814439 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814247 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09"} Apr 20 21:16:17.814439 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:17.814261 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7"} Apr 20 21:16:18.401954 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.401933 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:18.452794 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.452762 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.452926 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.452821 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-metrics-client-ca\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.452926 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.452857 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.452926 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.452891 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqp6l\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-kube-api-access-xqp6l\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453122 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.452936 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config-out\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453122 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.452969 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-tls-assets\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453122 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.452994 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-tls\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453122 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453027 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-grpc-tls\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453122 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453051 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-web-config\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453122 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453077 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-trusted-ca-bundle\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453122 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453117 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-kube-rbac-proxy\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453143 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453178 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-kubelet-serving-ca-bundle\") pod 
\"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453220 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-thanos-prometheus-http-client-file\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453251 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-serving-certs-ca-bundle\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453278 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-rulefiles-0\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453307 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-db\") pod \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453352 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-metrics-client-certs\") pod 
\"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\" (UID: \"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8\") " Apr 20 21:16:18.453457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453404 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:16:18.453829 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.453663 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-metrics-client-ca\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:16:18.454370 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.454149 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:16:18.454925 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.454671 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:16:18.455376 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.455154 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:16:18.456446 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.455904 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:16:18.457078 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.457040 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:16:18.457160 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.457092 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:16:18.457685 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.457655 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.457764 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.457686 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.457835 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.457814 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.457907 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.457859 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). 
InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.458215 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.458189 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.458398 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.458375 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.458666 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.458639 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-kube-api-access-xqp6l" (OuterVolumeSpecName: "kube-api-access-xqp6l") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "kube-api-access-xqp6l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:16:18.458798 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.458774 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config-out" (OuterVolumeSpecName: "config-out") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:16:18.459640 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.459615 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config" (OuterVolumeSpecName: "config") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.459904 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.459886 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:16:18.470304 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.470279 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-web-config" (OuterVolumeSpecName: "web-config") pod "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" (UID: "f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:18.554827 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554807 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config-out\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554827 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554825 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-tls-assets\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554835 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554845 2567 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-grpc-tls\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554853 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-web-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554861 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554869 2567 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-kube-rbac-proxy\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554878 2567 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554887 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554897 2567 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554905 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554914 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554923 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-prometheus-k8s-db\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554932 2567 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-metrics-client-certs\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.554937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554941 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.555327 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554950 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.555327 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.554959 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqp6l\" (UniqueName: \"kubernetes.io/projected/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8-kube-api-access-xqp6l\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:16:18.820332 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.820306 2567 generic.go:358] "Generic (PLEG): container finished" podID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerID="92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b" exitCode=0
Apr 20 21:16:18.820453 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.820399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b"}
Apr 20 21:16:18.820500 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.820451 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.820500 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.820471 2567 scope.go:117] "RemoveContainer" containerID="2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030"
Apr 20 21:16:18.820617 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.820459 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8","Type":"ContainerDied","Data":"de991db9c277569677819caecf187e369a94f88a9621db089c4c82a163229c0f"}
Apr 20 21:16:18.828696 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.828678 2567 scope.go:117] "RemoveContainer" containerID="e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a"
Apr 20 21:16:18.835844 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.835824 2567 scope.go:117] "RemoveContainer" containerID="92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b"
Apr 20 21:16:18.842464 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.842440 2567 scope.go:117] "RemoveContainer" containerID="4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8"
Apr 20 21:16:18.844615 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.844588 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 21:16:18.849327 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.849301 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 21:16:18.850815 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.850787 2567 scope.go:117] "RemoveContainer" containerID="f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09"
Apr 20 21:16:18.856951 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.856925 2567 scope.go:117] "RemoveContainer" containerID="8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7"
Apr 20 21:16:18.863716 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.863701 2567 scope.go:117] "RemoveContainer" containerID="266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575"
Apr 20 21:16:18.869731 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.869716 2567 scope.go:117] "RemoveContainer" containerID="2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030"
Apr 20 21:16:18.870061 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:18.870034 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030\": container with ID starting with 2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030 not found: ID does not exist" containerID="2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030"
Apr 20 21:16:18.870161 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.870073 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030"} err="failed to get container status \"2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030\": rpc error: code = NotFound desc = could not find container \"2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030\": container with ID starting with 2a08d6d38e9e4e2114bf277fcbe6a761391f7d1e03920ad88102f8c5c766c030 not found: ID does not exist"
Apr 20 21:16:18.870161 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.870096 2567 scope.go:117] "RemoveContainer" containerID="e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a"
Apr 20 21:16:18.870846 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:18.870714 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a\": container with ID starting with e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a not found: ID does not exist" containerID="e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a"
Apr 20 21:16:18.870846 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.870749 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a"} err="failed to get container status \"e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a\": rpc error: code = NotFound desc = could not find container \"e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a\": container with ID starting with e98faecb9a3db794290720b88827aef9174cf4331c320e37c4c0e2cfecb83a6a not found: ID does not exist"
Apr 20 21:16:18.870846 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.870773 2567 scope.go:117] "RemoveContainer" containerID="92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b"
Apr 20 21:16:18.871105 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:18.871080 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b\": container with ID starting with 92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b not found: ID does not exist" containerID="92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b"
Apr 20 21:16:18.871306 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.871111 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b"} err="failed to get container status \"92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b\": rpc error: code = NotFound desc = could not find container \"92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b\": container with ID starting with 92739cf2187f0212e3d34a4d055f3c95fcd0e231962cd448345e787ca299a44b not found: ID does not exist"
Apr 20 21:16:18.871306 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.871130 2567 scope.go:117] "RemoveContainer" containerID="4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8"
Apr 20 21:16:18.871536 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:18.871402 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8\": container with ID starting with 4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8 not found: ID does not exist" containerID="4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8"
Apr 20 21:16:18.871536 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.871448 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8"} err="failed to get container status \"4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8\": rpc error: code = NotFound desc = could not find container \"4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8\": container with ID starting with 4e02a94849ddb8edf2ace59d3483b3fa169fa10e9f082b6739849c9d0e684bd8 not found: ID does not exist"
Apr 20 21:16:18.871536 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.871471 2567 scope.go:117] "RemoveContainer" containerID="f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09"
Apr 20 21:16:18.871728 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:18.871708 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09\": container with ID starting with f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09 not found: ID does not exist" containerID="f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09"
Apr 20 21:16:18.871780 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.871733 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09"} err="failed to get container status \"f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09\": rpc error: code = NotFound desc = could not find container \"f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09\": container with ID starting with f61f9d99ebd01792c3807fe2d54f78471b8aaf1b8e0e0e0f95d8c7b5c4441a09 not found: ID does not exist"
Apr 20 21:16:18.871780 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.871755 2567 scope.go:117] "RemoveContainer" containerID="8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7"
Apr 20 21:16:18.872060 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:18.872025 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7\": container with ID starting with 8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7 not found: ID does not exist" containerID="8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7"
Apr 20 21:16:18.872109 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872059 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7"} err="failed to get container status \"8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7\": rpc error: code = NotFound desc = could not find container \"8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7\": container with ID starting with 8a7268736b35cdba1eb37b5b056d4e06461626dead707d6fc59082ea3ff388c7 not found: ID does not exist"
Apr 20 21:16:18.872109 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872079 2567 scope.go:117] "RemoveContainer" containerID="266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575"
Apr 20 21:16:18.872197 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872178 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 21:16:18.872363 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:18.872344 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575\": container with ID starting with 266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575 not found: ID does not exist" containerID="266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575"
Apr 20 21:16:18.872417 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872372 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575"} err="failed to get container status \"266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575\": rpc error: code = NotFound desc = could not find container \"266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575\": container with ID starting with 266f28ff98988a0c812255fe55a55e622131eeb224cd7a29e4499309820e4575 not found: ID does not exist"
Apr 20 21:16:18.872513 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872495 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a253c448-b3d5-4789-8811-020aa486a4f9" containerName="console"
Apr 20 21:16:18.872513 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872514 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a253c448-b3d5-4789-8811-020aa486a4f9" containerName="console"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872526 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="alertmanager"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872535 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="alertmanager"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872543 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-web"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872551 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-web"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872566 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872574 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872592 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="init-config-reloader"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872600 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="init-config-reloader"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872613 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="prom-label-proxy"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872621 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="prom-label-proxy"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872630 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-thanos"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872639 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-thanos"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872649 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d573021-c5bc-47b2-9ac6-42f44680ba76" containerName="console"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872657 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d573021-c5bc-47b2-9ac6-42f44680ba76" containerName="console"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872667 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="config-reloader"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872674 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="config-reloader"
Apr 20 21:16:18.872678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872684 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-web"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872692 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-web"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872700 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="prometheus"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872709 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="prometheus"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872717 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="thanos-sidecar"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872726 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="thanos-sidecar"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872738 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="init-config-reloader"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872746 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="init-config-reloader"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872762 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-metric"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872771 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-metric"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872780 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="config-reloader"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872788 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="config-reloader"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872803 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872811 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872889 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="alertmanager"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872902 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-metric"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872911 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="config-reloader"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872921 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d573021-c5bc-47b2-9ac6-42f44680ba76" containerName="console"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872931 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="config-reloader"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872943 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="prom-label-proxy"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872962 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872972 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-web"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872982 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy-web"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.872991 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6175a76e-94a4-4756-8404-b83648817d50" containerName="kube-rbac-proxy"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.873000 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="prometheus"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.873009 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="thanos-sidecar"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.873019 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a253c448-b3d5-4789-8811-020aa486a4f9" containerName="console"
Apr 20 21:16:18.873457 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.873029 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" containerName="kube-rbac-proxy-thanos"
Apr 20 21:16:18.879915 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.879893 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.882297 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.882278 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 21:16:18.882399 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.882345 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-gh5on71p28ep\""
Apr 20 21:16:18.882399 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.882367 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 21:16:18.882693 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.882675 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hmc7t\""
Apr 20 21:16:18.882747 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.882691 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 21:16:18.882949 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.882932 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 21:16:18.883370 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.883354 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 21:16:18.883484 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.883467 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 21:16:18.883603 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.883586 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 21:16:18.883603 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.883598 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 21:16:18.883838 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.883818 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 21:16:18.883937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.883927 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 21:16:18.885854 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.885832 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 21:16:18.889485 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.889466 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 21:16:18.890263 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.890243 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 21:16:18.957972 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.957947 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.957975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.957990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5tc2\" (UniqueName: \"kubernetes.io/projected/74012b25-4329-404e-b0c9-0194ff3c8ce9-kube-api-access-c5tc2\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/74012b25-4329-404e-b0c9-0194ff3c8ce9-config-out\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958082 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-web-config\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958099 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958127 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958170 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-config\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958224 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958257 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958278 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958295 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958311 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958327 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958343 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:18.958517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:18.958373 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/74012b25-4329-404e-b0c9-0194ff3c8ce9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:19.059505 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.059484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-web-config\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:16:19.059587 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.059511 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.059587 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.059538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.059587 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.059575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.060791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.060764 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.060890 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.059706 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-config\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.060890 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.060864 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061021 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.060920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061021 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.060975 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061021 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061017 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061175 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061050 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061175 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061175 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061175 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061159 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/74012b25-4329-404e-b0c9-0194ff3c8ce9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061363 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061363 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061363 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5tc2\" (UniqueName: \"kubernetes.io/projected/74012b25-4329-404e-b0c9-0194ff3c8ce9-kube-api-access-c5tc2\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061363 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061324 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/74012b25-4329-404e-b0c9-0194ff3c8ce9-config-out\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.061577 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.061367 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.062886 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.062856 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.063142 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.063115 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.064200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.063464 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.064200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.063849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-web-config\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.064200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.063869 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.064200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.063116 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-config\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.064841 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.064817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.064841 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.064831 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.065014 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.064907 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.069598 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.066028 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/74012b25-4329-404e-b0c9-0194ff3c8ce9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.069598 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.066228 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.069598 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.066507 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.069598 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.066948 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.069598 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.067042 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/74012b25-4329-404e-b0c9-0194ff3c8ce9-config-out\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.069598 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.067731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/74012b25-4329-404e-b0c9-0194ff3c8ce9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.069598 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.068723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/74012b25-4329-404e-b0c9-0194ff3c8ce9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.072801 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.072746 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c5tc2\" (UniqueName: \"kubernetes.io/projected/74012b25-4329-404e-b0c9-0194ff3c8ce9-kube-api-access-c5tc2\") pod \"prometheus-k8s-0\" (UID: \"74012b25-4329-404e-b0c9-0194ff3c8ce9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.125329 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.125308 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8" path="/var/lib/kubelet/pods/f3ff3fa9-677b-493a-8ab9-5937fa0cf5c8/volumes" Apr 20 21:16:19.191357 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.191335 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:19.313894 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.313869 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 21:16:19.316555 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:16:19.316522 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74012b25_4329_404e_b0c9_0194ff3c8ce9.slice/crio-82b138ffeadf871bf07db2399745de51e4bc61a644848dde6bbb8effdac19690 WatchSource:0}: Error finding container 82b138ffeadf871bf07db2399745de51e4bc61a644848dde6bbb8effdac19690: Status 404 returned error can't find the container with id 82b138ffeadf871bf07db2399745de51e4bc61a644848dde6bbb8effdac19690 Apr 20 21:16:19.824930 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.824892 2567 generic.go:358] "Generic (PLEG): container finished" podID="74012b25-4329-404e-b0c9-0194ff3c8ce9" containerID="82b7db53687a9bfc89e7be3d1ae16325de358cb7747dbde01153a1fc66b8226a" exitCode=0 Apr 20 21:16:19.825359 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.824984 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerDied","Data":"82b7db53687a9bfc89e7be3d1ae16325de358cb7747dbde01153a1fc66b8226a"} Apr 20 21:16:19.825359 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:19.825028 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerStarted","Data":"82b138ffeadf871bf07db2399745de51e4bc61a644848dde6bbb8effdac19690"} Apr 20 21:16:20.832846 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:20.832808 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerStarted","Data":"487f5ec4a814630f9a52e53380033eb1e21c6c355998ce86d551f214f55dd7e1"} Apr 20 21:16:20.833228 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:20.832853 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerStarted","Data":"9841956259f95ced6905eceac282032610a6d424ddf7a3092b20ac55cacb75c7"} Apr 20 21:16:20.833228 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:20.832863 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerStarted","Data":"742fe51daf0890c62fb4769dbe1de9d12ffe7147d60ac5cdbc1399b3aeb94281"} Apr 20 21:16:20.833228 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:20.832873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerStarted","Data":"99e9ec0fab80237851f0222658175fc4eda2a25b08f965003648ae8eceda0714"} Apr 20 21:16:20.833228 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:20.832882 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerStarted","Data":"6ff595fdb1706436d69f8ccc1edca22ec840cb0dc4f05c5ef4be91afce15e61e"} Apr 20 21:16:20.833228 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:20.832892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"74012b25-4329-404e-b0c9-0194ff3c8ce9","Type":"ContainerStarted","Data":"c540f234c252ac6e763259d550983be9d05a69c1d26dfa821f83d7e65a97cf98"} Apr 20 21:16:20.860754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:20.860702 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.860685151 podStartE2EDuration="2.860685151s" podCreationTimestamp="2026-04-20 21:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:16:20.860494057 +0000 UTC m=+202.253679601" watchObservedRunningTime="2026-04-20 21:16:20.860685151 +0000 UTC m=+202.253870693" Apr 20 21:16:24.191602 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:24.191552 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 21:16:34.571040 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:34.571005 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-567b4d7d79-cdjlc"] Apr 20 21:16:59.589394 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.589356 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-567b4d7d79-cdjlc" podUID="b234f8c2-080c-4bbc-9324-43e9f84adc34" containerName="console" containerID="cri-o://de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b" gracePeriod=15 Apr 20 21:16:59.624855 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:59.624821 2567 cadvisor_stats_provider.go:525] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb234f8c2_080c_4bbc_9324_43e9f84adc34.slice/crio-conmon-de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b.scope\": RecentStats: unable to find data in memory cache]" Apr 20 21:16:59.845416 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.845369 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-567b4d7d79-cdjlc_b234f8c2-080c-4bbc-9324-43e9f84adc34/console/0.log" Apr 20 21:16:59.845539 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.845449 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-567b4d7d79-cdjlc" Apr 20 21:16:59.955370 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.955344 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-567b4d7d79-cdjlc_b234f8c2-080c-4bbc-9324-43e9f84adc34/console/0.log" Apr 20 21:16:59.955515 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.955383 2567 generic.go:358] "Generic (PLEG): container finished" podID="b234f8c2-080c-4bbc-9324-43e9f84adc34" containerID="de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b" exitCode=2 Apr 20 21:16:59.955515 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.955462 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-567b4d7d79-cdjlc" Apr 20 21:16:59.955515 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.955473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567b4d7d79-cdjlc" event={"ID":"b234f8c2-080c-4bbc-9324-43e9f84adc34","Type":"ContainerDied","Data":"de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b"} Apr 20 21:16:59.955515 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.955506 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567b4d7d79-cdjlc" event={"ID":"b234f8c2-080c-4bbc-9324-43e9f84adc34","Type":"ContainerDied","Data":"24417bba11e280acce65e8c4bd31accea85def40911520f8f47bdd3ed5ba7097"} Apr 20 21:16:59.955663 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.955524 2567 scope.go:117] "RemoveContainer" containerID="de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b" Apr 20 21:16:59.962972 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.962957 2567 scope.go:117] "RemoveContainer" containerID="de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b" Apr 20 21:16:59.963210 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:16:59.963194 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b\": container with ID starting with de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b not found: ID does not exist" containerID="de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b" Apr 20 21:16:59.963252 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.963218 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b"} err="failed to get container status \"de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b\": rpc error: code = 
NotFound desc = could not find container \"de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b\": container with ID starting with de997ffa6c5cc51bd7f755b3db464d762e2592b9180f4dc3b4b5bc199e01e43b not found: ID does not exist" Apr 20 21:16:59.968589 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.968575 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-oauth-serving-cert\") pod \"b234f8c2-080c-4bbc-9324-43e9f84adc34\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " Apr 20 21:16:59.968645 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.968619 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-config\") pod \"b234f8c2-080c-4bbc-9324-43e9f84adc34\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " Apr 20 21:16:59.968679 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.968655 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq2ld\" (UniqueName: \"kubernetes.io/projected/b234f8c2-080c-4bbc-9324-43e9f84adc34-kube-api-access-kq2ld\") pod \"b234f8c2-080c-4bbc-9324-43e9f84adc34\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " Apr 20 21:16:59.968716 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.968691 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-serving-cert\") pod \"b234f8c2-080c-4bbc-9324-43e9f84adc34\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " Apr 20 21:16:59.968716 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.968705 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-service-ca\") pod \"b234f8c2-080c-4bbc-9324-43e9f84adc34\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " Apr 20 21:16:59.968812 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.968723 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-trusted-ca-bundle\") pod \"b234f8c2-080c-4bbc-9324-43e9f84adc34\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " Apr 20 21:16:59.968812 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.968765 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-oauth-config\") pod \"b234f8c2-080c-4bbc-9324-43e9f84adc34\" (UID: \"b234f8c2-080c-4bbc-9324-43e9f84adc34\") " Apr 20 21:16:59.969041 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.969015 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b234f8c2-080c-4bbc-9324-43e9f84adc34" (UID: "b234f8c2-080c-4bbc-9324-43e9f84adc34"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:16:59.969147 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.969125 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-config" (OuterVolumeSpecName: "console-config") pod "b234f8c2-080c-4bbc-9324-43e9f84adc34" (UID: "b234f8c2-080c-4bbc-9324-43e9f84adc34"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:16:59.969197 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.969154 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-service-ca" (OuterVolumeSpecName: "service-ca") pod "b234f8c2-080c-4bbc-9324-43e9f84adc34" (UID: "b234f8c2-080c-4bbc-9324-43e9f84adc34"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:16:59.969197 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.969160 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b234f8c2-080c-4bbc-9324-43e9f84adc34" (UID: "b234f8c2-080c-4bbc-9324-43e9f84adc34"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:16:59.970686 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.970661 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b234f8c2-080c-4bbc-9324-43e9f84adc34-kube-api-access-kq2ld" (OuterVolumeSpecName: "kube-api-access-kq2ld") pod "b234f8c2-080c-4bbc-9324-43e9f84adc34" (UID: "b234f8c2-080c-4bbc-9324-43e9f84adc34"). InnerVolumeSpecName "kube-api-access-kq2ld". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:16:59.970812 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.970794 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b234f8c2-080c-4bbc-9324-43e9f84adc34" (UID: "b234f8c2-080c-4bbc-9324-43e9f84adc34"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:16:59.970876 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:16:59.970837 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b234f8c2-080c-4bbc-9324-43e9f84adc34" (UID: "b234f8c2-080c-4bbc-9324-43e9f84adc34"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:17:00.069456 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.069414 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-serving-cert\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:17:00.069456 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.069455 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-service-ca\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:17:00.069576 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.069467 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-trusted-ca-bundle\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:17:00.069576 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.069476 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-oauth-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:17:00.069576 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.069486 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-oauth-serving-cert\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:17:00.069576 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.069495 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b234f8c2-080c-4bbc-9324-43e9f84adc34-console-config\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:17:00.069576 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.069503 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kq2ld\" (UniqueName: \"kubernetes.io/projected/b234f8c2-080c-4bbc-9324-43e9f84adc34-kube-api-access-kq2ld\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:17:00.276900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.276876 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-567b4d7d79-cdjlc"]
Apr 20 21:17:00.281848 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:00.281829 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-567b4d7d79-cdjlc"]
Apr 20 21:17:01.132023 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:01.131988 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b234f8c2-080c-4bbc-9324-43e9f84adc34" path="/var/lib/kubelet/pods/b234f8c2-080c-4bbc-9324-43e9f84adc34/volumes"
Apr 20 21:17:19.191948 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:19.191863 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:17:19.206859 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:19.206831 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:17:20.029667 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:20.029641 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 21:17:59.006873 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:17:59.006847 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 21:18:18.108577 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.108543 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"]
Apr 20 21:18:18.111069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.108820 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b234f8c2-080c-4bbc-9324-43e9f84adc34" containerName="console"
Apr 20 21:18:18.111069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.108831 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b234f8c2-080c-4bbc-9324-43e9f84adc34" containerName="console"
Apr 20 21:18:18.111069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.108916 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b234f8c2-080c-4bbc-9324-43e9f84adc34" containerName="console"
Apr 20 21:18:18.111832 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.111813 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.114252 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.114232 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 21:18:18.114356 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.114286 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-hgbvg\""
Apr 20 21:18:18.114447 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.114410 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 21:18:18.114648 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.114630 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 21:18:18.114754 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.114706 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 21:18:18.125276 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.125248 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"]
Apr 20 21:18:18.200576 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.200546 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.200673 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.200599 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.200724 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.200707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvb6\" (UniqueName: \"kubernetes.io/projected/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-kube-api-access-8jvb6\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.301160 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.301134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.301280 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.301181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.301280 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.301252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvb6\" (UniqueName: \"kubernetes.io/projected/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-kube-api-access-8jvb6\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.303436 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.303398 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.303557 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.303536 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.309626 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.309592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvb6\" (UniqueName: \"kubernetes.io/projected/8ac13ae9-ad58-4d35-8a94-aecd971a7ba1-kube-api-access-8jvb6\") pod \"opendatahub-operator-controller-manager-85fc55dd88-ssfl8\" (UID: \"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.421985 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.421929 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:18.567186 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.567156 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"]
Apr 20 21:18:18.569757 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:18:18.569729 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac13ae9_ad58_4d35_8a94_aecd971a7ba1.slice/crio-51dc3563cf36a94550f43df0086c82472edec16d1828615f090939cd10a921c0 WatchSource:0}: Error finding container 51dc3563cf36a94550f43df0086c82472edec16d1828615f090939cd10a921c0: Status 404 returned error can't find the container with id 51dc3563cf36a94550f43df0086c82472edec16d1828615f090939cd10a921c0
Apr 20 21:18:18.571742 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.571722 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:18:18.614975 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.614946 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"]
Apr 20 21:18:18.619375 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.619351 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.621757 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.621738 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 21:18:18.621846 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.621791 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:18:18.621969 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.621955 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 21:18:18.622163 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.622150 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 21:18:18.622411 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.622396 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sgxkp\""
Apr 20 21:18:18.622478 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.622453 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 21:18:18.628829 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.628809 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"]
Apr 20 21:18:18.705069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.705011 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2c87\" (UniqueName: \"kubernetes.io/projected/b0a3a70e-2b48-4d4a-915c-4534bc49589b-kube-api-access-b2c87\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.705069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.705049 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0a3a70e-2b48-4d4a-915c-4534bc49589b-cert\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.705221 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.705074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0a3a70e-2b48-4d4a-915c-4534bc49589b-metrics-cert\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.705221 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.705118 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b0a3a70e-2b48-4d4a-915c-4534bc49589b-manager-config\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.805567 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.805541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0a3a70e-2b48-4d4a-915c-4534bc49589b-cert\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.805658 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.805576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0a3a70e-2b48-4d4a-915c-4534bc49589b-metrics-cert\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.805658 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.805595 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b0a3a70e-2b48-4d4a-915c-4534bc49589b-manager-config\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.805658 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.805647 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2c87\" (UniqueName: \"kubernetes.io/projected/b0a3a70e-2b48-4d4a-915c-4534bc49589b-kube-api-access-b2c87\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.806242 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.806223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b0a3a70e-2b48-4d4a-915c-4534bc49589b-manager-config\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.808001 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.807975 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0a3a70e-2b48-4d4a-915c-4534bc49589b-metrics-cert\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.808084 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.807981 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0a3a70e-2b48-4d4a-915c-4534bc49589b-cert\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.818000 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.817982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2c87\" (UniqueName: \"kubernetes.io/projected/b0a3a70e-2b48-4d4a-915c-4534bc49589b-kube-api-access-b2c87\") pod \"lws-controller-manager-56b87855f9-gzqbf\" (UID: \"b0a3a70e-2b48-4d4a-915c-4534bc49589b\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:18.929937 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:18.929915 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:19.048778 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:19.048716 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"]
Apr 20 21:18:19.051600 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:18:19.051571 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a3a70e_2b48_4d4a_915c_4534bc49589b.slice/crio-a5556f1fc93ace096d7b8a82bb15ad451fcd7a72f92f7ec9f68b15d1c1992cd4 WatchSource:0}: Error finding container a5556f1fc93ace096d7b8a82bb15ad451fcd7a72f92f7ec9f68b15d1c1992cd4: Status 404 returned error can't find the container with id a5556f1fc93ace096d7b8a82bb15ad451fcd7a72f92f7ec9f68b15d1c1992cd4
Apr 20 21:18:19.187370 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:19.187331 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8" event={"ID":"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1","Type":"ContainerStarted","Data":"51dc3563cf36a94550f43df0086c82472edec16d1828615f090939cd10a921c0"}
Apr 20 21:18:19.190517 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:19.190485 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf" event={"ID":"b0a3a70e-2b48-4d4a-915c-4534bc49589b","Type":"ContainerStarted","Data":"a5556f1fc93ace096d7b8a82bb15ad451fcd7a72f92f7ec9f68b15d1c1992cd4"}
Apr 20 21:18:23.208169 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:23.208131 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8" event={"ID":"8ac13ae9-ad58-4d35-8a94-aecd971a7ba1","Type":"ContainerStarted","Data":"d3485413fdb507fb654d8a33cd26f61fba7e47fee05e7f64630428c9fc631042"}
Apr 20 21:18:23.208643 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:23.208331 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:23.209506 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:23.209484 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf" event={"ID":"b0a3a70e-2b48-4d4a-915c-4534bc49589b","Type":"ContainerStarted","Data":"2d4b96c6e1874881b14bf7f00f86e6214800009ecdd53659670c7888c17620ec"}
Apr 20 21:18:23.209678 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:23.209665 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:23.228015 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:23.227950 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8" podStartSLOduration=1.541862718 podStartE2EDuration="5.227936698s" podCreationTimestamp="2026-04-20 21:18:18 +0000 UTC" firstStartedPulling="2026-04-20 21:18:18.571896585 +0000 UTC m=+319.965082119" lastFinishedPulling="2026-04-20 21:18:22.257970581 +0000 UTC m=+323.651156099" observedRunningTime="2026-04-20 21:18:23.226650436 +0000 UTC m=+324.619835978" watchObservedRunningTime="2026-04-20 21:18:23.227936698 +0000 UTC m=+324.621122240"
Apr 20 21:18:23.243791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:23.243754 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf" podStartSLOduration=1.987803293 podStartE2EDuration="5.243744131s" podCreationTimestamp="2026-04-20 21:18:18 +0000 UTC" firstStartedPulling="2026-04-20 21:18:19.054142619 +0000 UTC m=+320.447328138" lastFinishedPulling="2026-04-20 21:18:22.310083443 +0000 UTC m=+323.703268976" observedRunningTime="2026-04-20 21:18:23.242021457 +0000 UTC m=+324.635206996" watchObservedRunningTime="2026-04-20 21:18:23.243744131 +0000 UTC m=+324.636929672"
Apr 20 21:18:34.214760 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:34.214731 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-ssfl8"
Apr 20 21:18:34.215133 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:34.214914 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-gzqbf"
Apr 20 21:18:37.010612 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.010542 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"]
Apr 20 21:18:37.013777 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.013757 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.016618 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.016593 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 21:18:37.016755 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.016705 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-9lwtz\""
Apr 20 21:18:37.016755 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.016735 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 21:18:37.016878 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.016713 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 21:18:37.016878 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.016713 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 21:18:37.023681 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.023663 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"]
Apr 20 21:18:37.140500 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.140470 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlwvc\" (UniqueName: \"kubernetes.io/projected/7ba62331-8f35-44e1-85e5-8797b1e56dea-kube-api-access-hlwvc\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.140639 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.140511 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7ba62331-8f35-44e1-85e5-8797b1e56dea-tls-certs\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.140639 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.140576 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7ba62331-8f35-44e1-85e5-8797b1e56dea-tmp\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.241005 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.240977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlwvc\" (UniqueName: \"kubernetes.io/projected/7ba62331-8f35-44e1-85e5-8797b1e56dea-kube-api-access-hlwvc\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.241119 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.241019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7ba62331-8f35-44e1-85e5-8797b1e56dea-tls-certs\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.241119 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.241059 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7ba62331-8f35-44e1-85e5-8797b1e56dea-tmp\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.243337 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.243315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7ba62331-8f35-44e1-85e5-8797b1e56dea-tmp\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.243947 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.243928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7ba62331-8f35-44e1-85e5-8797b1e56dea-tls-certs\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.255775 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.255752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlwvc\" (UniqueName: \"kubernetes.io/projected/7ba62331-8f35-44e1-85e5-8797b1e56dea-kube-api-access-hlwvc\") pod \"kube-auth-proxy-b57dc9cf9-xqh89\" (UID: \"7ba62331-8f35-44e1-85e5-8797b1e56dea\") " pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.325749 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.325725 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"
Apr 20 21:18:37.442343 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:37.442316 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89"]
Apr 20 21:18:37.446336 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:18:37.446310 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ba62331_8f35_44e1_85e5_8797b1e56dea.slice/crio-d2ade2ec1f5c42d99b6d49d9587acf07d8d8685d861e9d3aa5009905ab838c26 WatchSource:0}: Error finding container d2ade2ec1f5c42d99b6d49d9587acf07d8d8685d861e9d3aa5009905ab838c26: Status 404 returned error can't find the container with id d2ade2ec1f5c42d99b6d49d9587acf07d8d8685d861e9d3aa5009905ab838c26
Apr 20 21:18:38.257749 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:38.257645 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89" event={"ID":"7ba62331-8f35-44e1-85e5-8797b1e56dea","Type":"ContainerStarted","Data":"d2ade2ec1f5c42d99b6d49d9587acf07d8d8685d861e9d3aa5009905ab838c26"}
Apr 20 21:18:41.268854 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:41.268820 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89" event={"ID":"7ba62331-8f35-44e1-85e5-8797b1e56dea","Type":"ContainerStarted","Data":"d8974b1bcb75ca1acebca5d5eaaebbb95175d85c9941ce970b3d75785b60050d"}
Apr 20 21:18:41.286318 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:18:41.286259 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-b57dc9cf9-xqh89" podStartSLOduration=1.984244365 podStartE2EDuration="5.286240508s" podCreationTimestamp="2026-04-20 21:18:36 +0000 UTC" firstStartedPulling="2026-04-20 21:18:37.447983119 +0000 UTC m=+338.841168638" lastFinishedPulling="2026-04-20 21:18:40.749979249 +0000 UTC m=+342.143164781" observedRunningTime="2026-04-20 21:18:41.285177824 +0000 UTC m=+342.678363366" watchObservedRunningTime="2026-04-20 21:18:41.286240508 +0000 UTC m=+342.679426050"
Apr 20 21:20:25.836635 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.836557 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"]
Apr 20 21:20:25.840114 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.840097 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:25.842738 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.842712 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-wnf2g\""
Apr 20 21:20:25.842738 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.842725 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 21:20:25.843024 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.842997 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 21:20:25.851874 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.851853 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"]
Apr 20 21:20:25.922618 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.922589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e82b4a1-efbe-4c17-ad3a-436edb78028d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:25.922725 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:25.922662 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhcw\" (UniqueName: \"kubernetes.io/projected/6e82b4a1-efbe-4c17-ad3a-436edb78028d-kube-api-access-tfhcw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:26.023508 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:26.023482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhcw\" (UniqueName: \"kubernetes.io/projected/6e82b4a1-efbe-4c17-ad3a-436edb78028d-kube-api-access-tfhcw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:26.023617 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:26.023523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e82b4a1-efbe-4c17-ad3a-436edb78028d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:26.023845 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:26.023829 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e82b4a1-efbe-4c17-ad3a-436edb78028d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:26.034542 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:26.034516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhcw\" (UniqueName: \"kubernetes.io/projected/6e82b4a1-efbe-4c17-ad3a-436edb78028d-kube-api-access-tfhcw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:26.149626 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:26.149576 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:26.267125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:26.267095 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"]
Apr 20 21:20:26.270677 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:20:26.270651 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e82b4a1_efbe_4c17_ad3a_436edb78028d.slice/crio-cc658bd415593ff6baff4967c57c8aefe0b253f7182a3af79ace596b46a5cb58 WatchSource:0}: Error finding container cc658bd415593ff6baff4967c57c8aefe0b253f7182a3af79ace596b46a5cb58: Status 404 returned error can't find the container with id cc658bd415593ff6baff4967c57c8aefe0b253f7182a3af79ace596b46a5cb58
Apr 20 21:20:26.586806 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:26.586781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" event={"ID":"6e82b4a1-efbe-4c17-ad3a-436edb78028d","Type":"ContainerStarted","Data":"cc658bd415593ff6baff4967c57c8aefe0b253f7182a3af79ace596b46a5cb58"}
Apr 20 21:20:33.612628 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:33.612591 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" event={"ID":"6e82b4a1-efbe-4c17-ad3a-436edb78028d","Type":"ContainerStarted","Data":"9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6"}
Apr 20 21:20:33.613000 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:33.612651 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:33.634456 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:33.634382 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" podStartSLOduration=2.125668482 podStartE2EDuration="8.634364136s" podCreationTimestamp="2026-04-20 21:20:25 +0000 UTC" firstStartedPulling="2026-04-20 21:20:26.272952319 +0000 UTC m=+447.666137840" lastFinishedPulling="2026-04-20 21:20:32.781647975 +0000 UTC m=+454.174833494" observedRunningTime="2026-04-20 21:20:33.631615007 +0000 UTC m=+455.024800547" watchObservedRunningTime="2026-04-20 21:20:33.634364136 +0000 UTC m=+455.027549678"
Apr 20 21:20:44.617990 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:44.617961 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"
Apr 20 21:20:46.324104 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.324065 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"]
Apr 20 21:20:46.324549 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.324291 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" containerName="manager"
containerID="cri-o://9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6" gracePeriod=2 Apr 20 21:20:46.331123 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.331094 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6"] Apr 20 21:20:46.346408 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.346381 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h"] Apr 20 21:20:46.346725 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.346712 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" containerName="manager" Apr 20 21:20:46.346791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.346727 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" containerName="manager" Apr 20 21:20:46.346791 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.346781 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" containerName="manager" Apr 20 21:20:46.349692 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.349672 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.362178 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.362153 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h"] Apr 20 21:20:46.379362 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.379324 2567 status_manager.go:895] "Failed to get status for pod" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" is forbidden: User \"system:node:ip-10-0-132-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-45.ec2.internal' and this object" Apr 20 21:20:46.476070 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.476034 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/de8803c1-9184-44d0-89b1-19b9f73e449a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wj8h\" (UID: \"de8803c1-9184-44d0-89b1-19b9f73e449a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.476216 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.476155 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ncl7\" (UniqueName: \"kubernetes.io/projected/de8803c1-9184-44d0-89b1-19b9f73e449a-kube-api-access-8ncl7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wj8h\" (UID: \"de8803c1-9184-44d0-89b1-19b9f73e449a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.549997 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.549978 2567 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" Apr 20 21:20:46.552232 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.552208 2567 status_manager.go:895] "Failed to get status for pod" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" is forbidden: User \"system:node:ip-10-0-132-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-45.ec2.internal' and this object" Apr 20 21:20:46.577696 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.577641 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/de8803c1-9184-44d0-89b1-19b9f73e449a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wj8h\" (UID: \"de8803c1-9184-44d0-89b1-19b9f73e449a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.577786 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.577701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ncl7\" (UniqueName: \"kubernetes.io/projected/de8803c1-9184-44d0-89b1-19b9f73e449a-kube-api-access-8ncl7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wj8h\" (UID: \"de8803c1-9184-44d0-89b1-19b9f73e449a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.577990 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.577971 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/de8803c1-9184-44d0-89b1-19b9f73e449a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wj8h\" (UID: 
\"de8803c1-9184-44d0-89b1-19b9f73e449a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.585913 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.585884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ncl7\" (UniqueName: \"kubernetes.io/projected/de8803c1-9184-44d0-89b1-19b9f73e449a-kube-api-access-8ncl7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wj8h\" (UID: \"de8803c1-9184-44d0-89b1-19b9f73e449a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.656599 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.656579 2567 generic.go:358] "Generic (PLEG): container finished" podID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" containerID="9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6" exitCode=0 Apr 20 21:20:46.656691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.656624 2567 scope.go:117] "RemoveContainer" containerID="9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6" Apr 20 21:20:46.656691 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.656623 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" Apr 20 21:20:46.658971 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.658946 2567 status_manager.go:895] "Failed to get status for pod" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" is forbidden: User \"system:node:ip-10-0-132-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-45.ec2.internal' and this object" Apr 20 21:20:46.664323 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.664307 2567 scope.go:117] "RemoveContainer" containerID="9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6" Apr 20 21:20:46.664627 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:20:46.664609 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6\": container with ID starting with 9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6 not found: ID does not exist" containerID="9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6" Apr 20 21:20:46.664685 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.664634 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6"} err="failed to get container status \"9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6\": rpc error: code = NotFound desc = could not find container \"9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6\": container with ID starting with 9f0e63c2d64836193f93f6537315543420d8dd5ae8b81260455184f2981e11a6 not found: ID does not exist" Apr 20 21:20:46.678482 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:20:46.678463 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfhcw\" (UniqueName: \"kubernetes.io/projected/6e82b4a1-efbe-4c17-ad3a-436edb78028d-kube-api-access-tfhcw\") pod \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " Apr 20 21:20:46.678562 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.678493 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e82b4a1-efbe-4c17-ad3a-436edb78028d-extensions-socket-volume\") pod \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\" (UID: \"6e82b4a1-efbe-4c17-ad3a-436edb78028d\") " Apr 20 21:20:46.678986 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.678966 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e82b4a1-efbe-4c17-ad3a-436edb78028d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6e82b4a1-efbe-4c17-ad3a-436edb78028d" (UID: "6e82b4a1-efbe-4c17-ad3a-436edb78028d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:20:46.680358 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.680338 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e82b4a1-efbe-4c17-ad3a-436edb78028d-kube-api-access-tfhcw" (OuterVolumeSpecName: "kube-api-access-tfhcw") pod "6e82b4a1-efbe-4c17-ad3a-436edb78028d" (UID: "6e82b4a1-efbe-4c17-ad3a-436edb78028d"). InnerVolumeSpecName "kube-api-access-tfhcw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:20:46.704653 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.704622 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:46.779164 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.779136 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfhcw\" (UniqueName: \"kubernetes.io/projected/6e82b4a1-efbe-4c17-ad3a-436edb78028d-kube-api-access-tfhcw\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:20:46.779258 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.779166 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e82b4a1-efbe-4c17-ad3a-436edb78028d-extensions-socket-volume\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:20:46.824345 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.824322 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h"] Apr 20 21:20:46.827751 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:20:46.827695 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8803c1_9184_44d0_89b1_19b9f73e449a.slice/crio-4d726854935f4a7445116e1c99ff1d06950058c7481c6f59a5f9fe86b6098bbb WatchSource:0}: Error finding container 4d726854935f4a7445116e1c99ff1d06950058c7481c6f59a5f9fe86b6098bbb: Status 404 returned error can't find the container with id 4d726854935f4a7445116e1c99ff1d06950058c7481c6f59a5f9fe86b6098bbb Apr 20 21:20:46.966334 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:46.966306 2567 status_manager.go:895] "Failed to get status for pod" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2crm6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-2crm6\" is forbidden: User \"system:node:ip-10-0-132-45.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace 
\"kuadrant-system\": no relationship found between node 'ip-10-0-132-45.ec2.internal' and this object" Apr 20 21:20:47.126312 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:47.126240 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e82b4a1-efbe-4c17-ad3a-436edb78028d" path="/var/lib/kubelet/pods/6e82b4a1-efbe-4c17-ad3a-436edb78028d/volumes" Apr 20 21:20:47.661587 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:47.661550 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" event={"ID":"de8803c1-9184-44d0-89b1-19b9f73e449a","Type":"ContainerStarted","Data":"5b6c62ca45b98b8a8e7d076458395970f8a24bbe73d8b521221ab447c6604c3e"} Apr 20 21:20:47.661587 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:47.661590 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" event={"ID":"de8803c1-9184-44d0-89b1-19b9f73e449a","Type":"ContainerStarted","Data":"4d726854935f4a7445116e1c99ff1d06950058c7481c6f59a5f9fe86b6098bbb"} Apr 20 21:20:47.662004 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:47.661619 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:20:47.689639 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:20:47.689598 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" podStartSLOduration=1.689585051 podStartE2EDuration="1.689585051s" podCreationTimestamp="2026-04-20 21:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:20:47.687582526 +0000 UTC m=+469.080768068" watchObservedRunningTime="2026-04-20 21:20:47.689585051 +0000 UTC m=+469.082770591" Apr 20 21:20:58.668043 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:20:58.668004 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:21:01.507712 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.507680 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h"] Apr 20 21:21:01.508081 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.507915 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" podUID="de8803c1-9184-44d0-89b1-19b9f73e449a" containerName="manager" containerID="cri-o://5b6c62ca45b98b8a8e7d076458395970f8a24bbe73d8b521221ab447c6604c3e" gracePeriod=10 Apr 20 21:21:01.714940 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.714911 2567 generic.go:358] "Generic (PLEG): container finished" podID="de8803c1-9184-44d0-89b1-19b9f73e449a" containerID="5b6c62ca45b98b8a8e7d076458395970f8a24bbe73d8b521221ab447c6604c3e" exitCode=0 Apr 20 21:21:01.715083 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.714979 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" event={"ID":"de8803c1-9184-44d0-89b1-19b9f73e449a","Type":"ContainerDied","Data":"5b6c62ca45b98b8a8e7d076458395970f8a24bbe73d8b521221ab447c6604c3e"} Apr 20 21:21:01.737885 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.737864 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:21:01.781979 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.781958 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/de8803c1-9184-44d0-89b1-19b9f73e449a-extensions-socket-volume\") pod \"de8803c1-9184-44d0-89b1-19b9f73e449a\" (UID: \"de8803c1-9184-44d0-89b1-19b9f73e449a\") " Apr 20 21:21:01.782084 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.781990 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ncl7\" (UniqueName: \"kubernetes.io/projected/de8803c1-9184-44d0-89b1-19b9f73e449a-kube-api-access-8ncl7\") pod \"de8803c1-9184-44d0-89b1-19b9f73e449a\" (UID: \"de8803c1-9184-44d0-89b1-19b9f73e449a\") " Apr 20 21:21:01.782263 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.782243 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8803c1-9184-44d0-89b1-19b9f73e449a-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "de8803c1-9184-44d0-89b1-19b9f73e449a" (UID: "de8803c1-9184-44d0-89b1-19b9f73e449a"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:21:01.783862 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.783837 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8803c1-9184-44d0-89b1-19b9f73e449a-kube-api-access-8ncl7" (OuterVolumeSpecName: "kube-api-access-8ncl7") pod "de8803c1-9184-44d0-89b1-19b9f73e449a" (UID: "de8803c1-9184-44d0-89b1-19b9f73e449a"). InnerVolumeSpecName "kube-api-access-8ncl7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:21:01.882704 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.882684 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/de8803c1-9184-44d0-89b1-19b9f73e449a-extensions-socket-volume\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:21:01.882704 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.882705 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ncl7\" (UniqueName: \"kubernetes.io/projected/de8803c1-9184-44d0-89b1-19b9f73e449a-kube-api-access-8ncl7\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:21:01.893903 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.893884 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"] Apr 20 21:21:01.894181 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.894169 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de8803c1-9184-44d0-89b1-19b9f73e449a" containerName="manager" Apr 20 21:21:01.894235 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.894183 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8803c1-9184-44d0-89b1-19b9f73e449a" containerName="manager" Apr 20 21:21:01.894280 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.894242 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="de8803c1-9184-44d0-89b1-19b9f73e449a" containerName="manager" Apr 20 21:21:01.897292 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.897276 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:01.907108 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.907086 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"] Apr 20 21:21:01.983607 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.983583 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmxst\" (UniqueName: \"kubernetes.io/projected/7870b953-b996-430f-9a37-63c6dd2c4c76-kube-api-access-rmxst\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2mmcq\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:01.983714 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:01.983627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7870b953-b996-430f-9a37-63c6dd2c4c76-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2mmcq\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:02.084850 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.084801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7870b953-b996-430f-9a37-63c6dd2c4c76-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2mmcq\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:02.084960 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.084868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmxst\" (UniqueName: 
\"kubernetes.io/projected/7870b953-b996-430f-9a37-63c6dd2c4c76-kube-api-access-rmxst\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2mmcq\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:02.085193 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.085172 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7870b953-b996-430f-9a37-63c6dd2c4c76-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2mmcq\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:02.103041 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.103018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmxst\" (UniqueName: \"kubernetes.io/projected/7870b953-b996-430f-9a37-63c6dd2c4c76-kube-api-access-rmxst\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2mmcq\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:02.207183 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.207163 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:02.325889 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.325863 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"] Apr 20 21:21:02.328415 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:21:02.328383 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7870b953_b996_430f_9a37_63c6dd2c4c76.slice/crio-c2e7262820679b3834678950a5011708b762f0540db57e9b79c1bdfaa115d39f WatchSource:0}: Error finding container c2e7262820679b3834678950a5011708b762f0540db57e9b79c1bdfaa115d39f: Status 404 returned error can't find the container with id c2e7262820679b3834678950a5011708b762f0540db57e9b79c1bdfaa115d39f Apr 20 21:21:02.719839 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.719760 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" event={"ID":"de8803c1-9184-44d0-89b1-19b9f73e449a","Type":"ContainerDied","Data":"4d726854935f4a7445116e1c99ff1d06950058c7481c6f59a5f9fe86b6098bbb"} Apr 20 21:21:02.719839 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.719809 2567 scope.go:117] "RemoveContainer" containerID="5b6c62ca45b98b8a8e7d076458395970f8a24bbe73d8b521221ab447c6604c3e" Apr 20 21:21:02.720282 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.719777 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h" Apr 20 21:21:02.721360 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.721330 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" event={"ID":"7870b953-b996-430f-9a37-63c6dd2c4c76","Type":"ContainerStarted","Data":"3fb2e8a600452d7c68d8e7427ffcdc6bb136333c13301cb8389206969ddb8ea3"} Apr 20 21:21:02.721468 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.721369 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" event={"ID":"7870b953-b996-430f-9a37-63c6dd2c4c76","Type":"ContainerStarted","Data":"c2e7262820679b3834678950a5011708b762f0540db57e9b79c1bdfaa115d39f"} Apr 20 21:21:02.721518 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.721469 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:02.743564 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.743504 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" podStartSLOduration=1.743487922 podStartE2EDuration="1.743487922s" podCreationTimestamp="2026-04-20 21:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:21:02.742183611 +0000 UTC m=+484.135369164" watchObservedRunningTime="2026-04-20 21:21:02.743487922 +0000 UTC m=+484.136673463" Apr 20 21:21:02.756366 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.756339 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h"] Apr 20 21:21:02.762166 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:02.762145 2567 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wj8h"] Apr 20 21:21:03.126386 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:03.126356 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8803c1-9184-44d0-89b1-19b9f73e449a" path="/var/lib/kubelet/pods/de8803c1-9184-44d0-89b1-19b9f73e449a/volumes" Apr 20 21:21:13.727840 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:13.727806 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" Apr 20 21:21:31.495314 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.495282 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qn9xm"] Apr 20 21:21:31.498995 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.498972 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" Apr 20 21:21:31.501408 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.501382 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-pcgm8\"" Apr 20 21:21:31.503664 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.503642 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qn9xm"] Apr 20 21:21:31.599758 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.599729 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v5ng\" (UniqueName: \"kubernetes.io/projected/ae92afec-21af-4d92-b184-e3cc900abe9d-kube-api-access-2v5ng\") pod \"authorino-f99f4b5cd-qn9xm\" (UID: \"ae92afec-21af-4d92-b184-e3cc900abe9d\") " pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" Apr 20 21:21:31.700605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.700577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2v5ng\" (UniqueName: \"kubernetes.io/projected/ae92afec-21af-4d92-b184-e3cc900abe9d-kube-api-access-2v5ng\") pod \"authorino-f99f4b5cd-qn9xm\" (UID: \"ae92afec-21af-4d92-b184-e3cc900abe9d\") " pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" Apr 20 21:21:31.708451 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.708412 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v5ng\" (UniqueName: \"kubernetes.io/projected/ae92afec-21af-4d92-b184-e3cc900abe9d-kube-api-access-2v5ng\") pod \"authorino-f99f4b5cd-qn9xm\" (UID: \"ae92afec-21af-4d92-b184-e3cc900abe9d\") " pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" Apr 20 21:21:31.809005 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.808982 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" Apr 20 21:21:31.926952 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:31.926893 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qn9xm"] Apr 20 21:21:31.929217 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:21:31.929193 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae92afec_21af_4d92_b184_e3cc900abe9d.slice/crio-af7a0b8ed6250c302dc3a2d576987dadf88c1b672850012208388b07910698cc WatchSource:0}: Error finding container af7a0b8ed6250c302dc3a2d576987dadf88c1b672850012208388b07910698cc: Status 404 returned error can't find the container with id af7a0b8ed6250c302dc3a2d576987dadf88c1b672850012208388b07910698cc Apr 20 21:21:32.828021 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:32.827971 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" event={"ID":"ae92afec-21af-4d92-b184-e3cc900abe9d","Type":"ContainerStarted","Data":"af7a0b8ed6250c302dc3a2d576987dadf88c1b672850012208388b07910698cc"} Apr 20 21:21:35.238534 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:35.238498 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qn9xm"] Apr 20 21:21:35.839711 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:35.839674 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" event={"ID":"ae92afec-21af-4d92-b184-e3cc900abe9d","Type":"ContainerStarted","Data":"06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916"} Apr 20 21:21:35.857939 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:35.857894 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" podStartSLOduration=1.736435162 podStartE2EDuration="4.857881513s" podCreationTimestamp="2026-04-20 21:21:31 +0000 UTC" firstStartedPulling="2026-04-20 21:21:31.930472014 +0000 UTC m=+513.323657532" lastFinishedPulling="2026-04-20 21:21:35.051918364 +0000 UTC m=+516.445103883" observedRunningTime="2026-04-20 21:21:35.857013979 +0000 UTC m=+517.250199520" watchObservedRunningTime="2026-04-20 21:21:35.857881513 +0000 UTC m=+517.251067053" Apr 20 21:21:36.843374 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:36.843287 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" podUID="ae92afec-21af-4d92-b184-e3cc900abe9d" containerName="authorino" containerID="cri-o://06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916" gracePeriod=30 Apr 20 21:21:37.081213 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.081189 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" Apr 20 21:21:37.144076 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.144021 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v5ng\" (UniqueName: \"kubernetes.io/projected/ae92afec-21af-4d92-b184-e3cc900abe9d-kube-api-access-2v5ng\") pod \"ae92afec-21af-4d92-b184-e3cc900abe9d\" (UID: \"ae92afec-21af-4d92-b184-e3cc900abe9d\") " Apr 20 21:21:37.145915 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.145888 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae92afec-21af-4d92-b184-e3cc900abe9d-kube-api-access-2v5ng" (OuterVolumeSpecName: "kube-api-access-2v5ng") pod "ae92afec-21af-4d92-b184-e3cc900abe9d" (UID: "ae92afec-21af-4d92-b184-e3cc900abe9d"). InnerVolumeSpecName "kube-api-access-2v5ng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:21:37.250151 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.246523 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2v5ng\" (UniqueName: \"kubernetes.io/projected/ae92afec-21af-4d92-b184-e3cc900abe9d-kube-api-access-2v5ng\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:21:37.847762 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.847726 2567 generic.go:358] "Generic (PLEG): container finished" podID="ae92afec-21af-4d92-b184-e3cc900abe9d" containerID="06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916" exitCode=0 Apr 20 21:21:37.848193 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.847794 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" Apr 20 21:21:37.848193 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.847811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" event={"ID":"ae92afec-21af-4d92-b184-e3cc900abe9d","Type":"ContainerDied","Data":"06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916"} Apr 20 21:21:37.848193 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.847851 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qn9xm" event={"ID":"ae92afec-21af-4d92-b184-e3cc900abe9d","Type":"ContainerDied","Data":"af7a0b8ed6250c302dc3a2d576987dadf88c1b672850012208388b07910698cc"} Apr 20 21:21:37.848193 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.847867 2567 scope.go:117] "RemoveContainer" containerID="06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916" Apr 20 21:21:37.856617 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.856583 2567 scope.go:117] "RemoveContainer" containerID="06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916" Apr 20 21:21:37.856902 ip-10-0-132-45 kubenswrapper[2567]: E0420 21:21:37.856879 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916\": container with ID starting with 06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916 not found: ID does not exist" containerID="06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916" Apr 20 21:21:37.856949 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.856910 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916"} err="failed to get container status \"06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916\": rpc error: code = NotFound 
desc = could not find container \"06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916\": container with ID starting with 06af5403c7eafb56489e018854645f6ed8e8eb7001799100d74c05e379f3d916 not found: ID does not exist" Apr 20 21:21:37.870979 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.870952 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qn9xm"] Apr 20 21:21:37.875156 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:37.875134 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qn9xm"] Apr 20 21:21:39.125464 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:39.125434 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae92afec-21af-4d92-b184-e3cc900abe9d" path="/var/lib/kubelet/pods/ae92afec-21af-4d92-b184-e3cc900abe9d/volumes" Apr 20 21:21:47.765688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.765655 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-b5nll"] Apr 20 21:21:47.766126 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.765961 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae92afec-21af-4d92-b184-e3cc900abe9d" containerName="authorino" Apr 20 21:21:47.766126 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.765972 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae92afec-21af-4d92-b184-e3cc900abe9d" containerName="authorino" Apr 20 21:21:47.766126 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.766030 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae92afec-21af-4d92-b184-e3cc900abe9d" containerName="authorino" Apr 20 21:21:47.768827 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.768802 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:47.771352 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.771330 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-pqjgf\"" Apr 20 21:21:47.771479 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.771330 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 21:21:47.776859 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.776641 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-b5nll"] Apr 20 21:21:47.916198 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.916174 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj42p\" (UniqueName: \"kubernetes.io/projected/2884d6da-2400-4e61-9656-9e236c686a9c-kube-api-access-kj42p\") pod \"postgres-868db5846d-b5nll\" (UID: \"2884d6da-2400-4e61-9656-9e236c686a9c\") " pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:47.916324 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:47.916217 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2884d6da-2400-4e61-9656-9e236c686a9c-data\") pod \"postgres-868db5846d-b5nll\" (UID: \"2884d6da-2400-4e61-9656-9e236c686a9c\") " pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:48.017560 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:48.017500 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj42p\" (UniqueName: \"kubernetes.io/projected/2884d6da-2400-4e61-9656-9e236c686a9c-kube-api-access-kj42p\") pod \"postgres-868db5846d-b5nll\" (UID: \"2884d6da-2400-4e61-9656-9e236c686a9c\") " pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:48.017560 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:48.017546 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2884d6da-2400-4e61-9656-9e236c686a9c-data\") pod \"postgres-868db5846d-b5nll\" (UID: \"2884d6da-2400-4e61-9656-9e236c686a9c\") " pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:48.017938 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:48.017919 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2884d6da-2400-4e61-9656-9e236c686a9c-data\") pod \"postgres-868db5846d-b5nll\" (UID: \"2884d6da-2400-4e61-9656-9e236c686a9c\") " pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:48.026534 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:48.026504 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj42p\" (UniqueName: \"kubernetes.io/projected/2884d6da-2400-4e61-9656-9e236c686a9c-kube-api-access-kj42p\") pod \"postgres-868db5846d-b5nll\" (UID: \"2884d6da-2400-4e61-9656-9e236c686a9c\") " pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:48.082471 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:48.082418 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:48.199087 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:48.199053 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-b5nll"] Apr 20 21:21:48.202954 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:21:48.202927 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2884d6da_2400_4e61_9656_9e236c686a9c.slice/crio-d74aec8d449d63c6b76f159b4558b987f85cb79c87f63a3dbc6b74f7a9e4891d WatchSource:0}: Error finding container d74aec8d449d63c6b76f159b4558b987f85cb79c87f63a3dbc6b74f7a9e4891d: Status 404 returned error can't find the container with id d74aec8d449d63c6b76f159b4558b987f85cb79c87f63a3dbc6b74f7a9e4891d Apr 20 21:21:48.896252 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:48.896209 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-b5nll" event={"ID":"2884d6da-2400-4e61-9656-9e236c686a9c","Type":"ContainerStarted","Data":"d74aec8d449d63c6b76f159b4558b987f85cb79c87f63a3dbc6b74f7a9e4891d"} Apr 20 21:21:53.915088 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:53.915057 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-b5nll" event={"ID":"2884d6da-2400-4e61-9656-9e236c686a9c","Type":"ContainerStarted","Data":"d457d402e6eef7e6de51557627cedfbbe36096e8b08797e1d9caec2475dace99"} Apr 20 21:21:53.915463 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:53.915168 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:21:53.931767 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:53.931715 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-b5nll" podStartSLOduration=2.147645414 podStartE2EDuration="6.931697024s" podCreationTimestamp="2026-04-20 21:21:47 +0000 UTC" 
firstStartedPulling="2026-04-20 21:21:48.204177394 +0000 UTC m=+529.597362913" lastFinishedPulling="2026-04-20 21:21:52.988229002 +0000 UTC m=+534.381414523" observedRunningTime="2026-04-20 21:21:53.929905417 +0000 UTC m=+535.323090958" watchObservedRunningTime="2026-04-20 21:21:53.931697024 +0000 UTC m=+535.324882568" Apr 20 21:21:59.948171 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:21:59.948143 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-b5nll" Apr 20 21:22:03.524191 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:03.524159 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5dc86d59f8-b5lxg"] Apr 20 21:22:03.709610 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:03.709578 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5dc86d59f8-b5lxg"] Apr 20 21:22:03.709781 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:03.709687 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:03.712608 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:03.712585 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-prkzx\"" Apr 20 21:22:03.834484 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:03.834460 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkp8c\" (UniqueName: \"kubernetes.io/projected/0fb71d8f-120e-4d3a-aa8f-95df7b06208b-kube-api-access-hkp8c\") pod \"maas-controller-5dc86d59f8-b5lxg\" (UID: \"0fb71d8f-120e-4d3a-aa8f-95df7b06208b\") " pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:03.935007 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:03.934982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkp8c\" (UniqueName: \"kubernetes.io/projected/0fb71d8f-120e-4d3a-aa8f-95df7b06208b-kube-api-access-hkp8c\") pod \"maas-controller-5dc86d59f8-b5lxg\" (UID: \"0fb71d8f-120e-4d3a-aa8f-95df7b06208b\") " pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:03.943428 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:03.943407 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkp8c\" (UniqueName: \"kubernetes.io/projected/0fb71d8f-120e-4d3a-aa8f-95df7b06208b-kube-api-access-hkp8c\") pod \"maas-controller-5dc86d59f8-b5lxg\" (UID: \"0fb71d8f-120e-4d3a-aa8f-95df7b06208b\") " pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:04.019711 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:04.019686 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:04.344224 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:04.344151 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5dc86d59f8-b5lxg"] Apr 20 21:22:04.346467 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:22:04.346394 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fb71d8f_120e_4d3a_aa8f_95df7b06208b.slice/crio-d88746ab44874e42a2eb22bff703259a473fda85f47aaabb7f63e31fefa9c18c WatchSource:0}: Error finding container d88746ab44874e42a2eb22bff703259a473fda85f47aaabb7f63e31fefa9c18c: Status 404 returned error can't find the container with id d88746ab44874e42a2eb22bff703259a473fda85f47aaabb7f63e31fefa9c18c Apr 20 21:22:04.954385 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:04.954339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" event={"ID":"0fb71d8f-120e-4d3a-aa8f-95df7b06208b","Type":"ContainerStarted","Data":"d88746ab44874e42a2eb22bff703259a473fda85f47aaabb7f63e31fefa9c18c"} Apr 20 21:22:06.964014 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:06.963979 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" event={"ID":"0fb71d8f-120e-4d3a-aa8f-95df7b06208b","Type":"ContainerStarted","Data":"22e6d5e4fba9661b730818d43d00e08d3ed4adf081d50aea3be4464b75d09873"} Apr 20 21:22:06.964479 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:06.964092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:06.981217 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:06.981174 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" podStartSLOduration=1.808281927 podStartE2EDuration="3.981161788s" 
podCreationTimestamp="2026-04-20 21:22:03 +0000 UTC" firstStartedPulling="2026-04-20 21:22:04.347797241 +0000 UTC m=+545.740982761" lastFinishedPulling="2026-04-20 21:22:06.520677102 +0000 UTC m=+547.913862622" observedRunningTime="2026-04-20 21:22:06.979636452 +0000 UTC m=+548.372822000" watchObservedRunningTime="2026-04-20 21:22:06.981161788 +0000 UTC m=+548.374347330" Apr 20 21:22:17.972915 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:17.972886 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:31.855977 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:31.855945 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5dc86d59f8-b5lxg"] Apr 20 21:22:31.856369 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:31.856185 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" podUID="0fb71d8f-120e-4d3a-aa8f-95df7b06208b" containerName="manager" containerID="cri-o://22e6d5e4fba9661b730818d43d00e08d3ed4adf081d50aea3be4464b75d09873" gracePeriod=10 Apr 20 21:22:32.053992 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:32.053964 2567 generic.go:358] "Generic (PLEG): container finished" podID="0fb71d8f-120e-4d3a-aa8f-95df7b06208b" containerID="22e6d5e4fba9661b730818d43d00e08d3ed4adf081d50aea3be4464b75d09873" exitCode=0 Apr 20 21:22:32.054111 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:32.054000 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" event={"ID":"0fb71d8f-120e-4d3a-aa8f-95df7b06208b","Type":"ContainerDied","Data":"22e6d5e4fba9661b730818d43d00e08d3ed4adf081d50aea3be4464b75d09873"} Apr 20 21:22:32.089771 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:32.089752 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:32.131544 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:32.131488 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkp8c\" (UniqueName: \"kubernetes.io/projected/0fb71d8f-120e-4d3a-aa8f-95df7b06208b-kube-api-access-hkp8c\") pod \"0fb71d8f-120e-4d3a-aa8f-95df7b06208b\" (UID: \"0fb71d8f-120e-4d3a-aa8f-95df7b06208b\") " Apr 20 21:22:32.133465 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:32.133438 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb71d8f-120e-4d3a-aa8f-95df7b06208b-kube-api-access-hkp8c" (OuterVolumeSpecName: "kube-api-access-hkp8c") pod "0fb71d8f-120e-4d3a-aa8f-95df7b06208b" (UID: "0fb71d8f-120e-4d3a-aa8f-95df7b06208b"). InnerVolumeSpecName "kube-api-access-hkp8c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:22:32.232737 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:32.232713 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkp8c\" (UniqueName: \"kubernetes.io/projected/0fb71d8f-120e-4d3a-aa8f-95df7b06208b-kube-api-access-hkp8c\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\"" Apr 20 21:22:33.058851 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:33.058806 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" event={"ID":"0fb71d8f-120e-4d3a-aa8f-95df7b06208b","Type":"ContainerDied","Data":"d88746ab44874e42a2eb22bff703259a473fda85f47aaabb7f63e31fefa9c18c"} Apr 20 21:22:33.058851 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:33.058851 2567 scope.go:117] "RemoveContainer" containerID="22e6d5e4fba9661b730818d43d00e08d3ed4adf081d50aea3be4464b75d09873" Apr 20 21:22:33.059392 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:33.058847 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5dc86d59f8-b5lxg" Apr 20 21:22:33.083581 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:33.081381 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5dc86d59f8-b5lxg"] Apr 20 21:22:33.087077 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:33.087052 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5dc86d59f8-b5lxg"] Apr 20 21:22:33.127476 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:22:33.127451 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb71d8f-120e-4d3a-aa8f-95df7b06208b" path="/var/lib/kubelet/pods/0fb71d8f-120e-4d3a-aa8f-95df7b06208b/volumes" Apr 20 21:23:23.231483 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.231447 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr"] Apr 20 21:23:23.232813 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.231871 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fb71d8f-120e-4d3a-aa8f-95df7b06208b" containerName="manager" Apr 20 21:23:23.232813 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.231885 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb71d8f-120e-4d3a-aa8f-95df7b06208b" containerName="manager" Apr 20 21:23:23.232813 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.231941 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fb71d8f-120e-4d3a-aa8f-95df7b06208b" containerName="manager" Apr 20 21:23:23.233681 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.233665 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.235993 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.235968 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 21:23:23.237033 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.237006 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 21:23:23.237161 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.237109 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-rl5wt\"" Apr 20 21:23:23.237161 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.237120 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 21:23:23.241291 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.241271 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr"] Apr 20 21:23:23.390591 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.390558 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0549a5a7-0007-4c5a-8055-17e6a5699efc-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.390729 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.390610 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.390729 
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.390690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.390729 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.390723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgcw\" (UniqueName: \"kubernetes.io/projected/0549a5a7-0007-4c5a-8055-17e6a5699efc-kube-api-access-rdgcw\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.390887 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.390751 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.390887 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.390794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492094 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492029 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rdgcw\" (UniqueName: \"kubernetes.io/projected/0549a5a7-0007-4c5a-8055-17e6a5699efc-kube-api-access-rdgcw\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492094 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492094 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492297 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492121 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0549a5a7-0007-4c5a-8055-17e6a5699efc-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492297 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492150 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492297 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492569 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492636 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492575 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.492636 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.492621 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.494313 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.494294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/0549a5a7-0007-4c5a-8055-17e6a5699efc-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.494605 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.494585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0549a5a7-0007-4c5a-8055-17e6a5699efc-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.502719 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.502699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgcw\" (UniqueName: \"kubernetes.io/projected/0549a5a7-0007-4c5a-8055-17e6a5699efc-kube-api-access-rdgcw\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr\" (UID: \"0549a5a7-0007-4c5a-8055-17e6a5699efc\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.543676 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.543659 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:23.662666 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.662638 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr"] Apr 20 21:23:23.664731 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:23:23.664703 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0549a5a7_0007_4c5a_8055_17e6a5699efc.slice/crio-5f95e9b8e836c25d845f906052a297954c8b87b563874738d10ee6fd6745544c WatchSource:0}: Error finding container 5f95e9b8e836c25d845f906052a297954c8b87b563874738d10ee6fd6745544c: Status 404 returned error can't find the container with id 5f95e9b8e836c25d845f906052a297954c8b87b563874738d10ee6fd6745544c Apr 20 21:23:23.666368 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:23.666352 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:23:24.230327 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:24.230287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" event={"ID":"0549a5a7-0007-4c5a-8055-17e6a5699efc","Type":"ContainerStarted","Data":"5f95e9b8e836c25d845f906052a297954c8b87b563874738d10ee6fd6745544c"} Apr 20 21:23:29.249315 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:29.249282 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" event={"ID":"0549a5a7-0007-4c5a-8055-17e6a5699efc","Type":"ContainerStarted","Data":"04e388c8cf07aea2143034d25ab4c6c56fab6f2c3fb68dc29c29a8c0bce116bd"} Apr 20 21:23:34.268664 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:34.268585 2567 generic.go:358] "Generic (PLEG): container finished" podID="0549a5a7-0007-4c5a-8055-17e6a5699efc" containerID="04e388c8cf07aea2143034d25ab4c6c56fab6f2c3fb68dc29c29a8c0bce116bd" exitCode=0 Apr 20 
21:23:34.268990 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:34.268664 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" event={"ID":"0549a5a7-0007-4c5a-8055-17e6a5699efc","Type":"ContainerDied","Data":"04e388c8cf07aea2143034d25ab4c6c56fab6f2c3fb68dc29c29a8c0bce116bd"} Apr 20 21:23:36.278975 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:36.278946 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" event={"ID":"0549a5a7-0007-4c5a-8055-17e6a5699efc","Type":"ContainerStarted","Data":"013c6fa7cc6dd768903feab657dc9b7aa7d1ecaf038346ca8893cac803753495"} Apr 20 21:23:36.279328 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:36.279146 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:36.296534 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:36.296493 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" podStartSLOduration=1.6422524630000002 podStartE2EDuration="13.296480585s" podCreationTimestamp="2026-04-20 21:23:23 +0000 UTC" firstStartedPulling="2026-04-20 21:23:23.666495709 +0000 UTC m=+625.059681228" lastFinishedPulling="2026-04-20 21:23:35.320723828 +0000 UTC m=+636.713909350" observedRunningTime="2026-04-20 21:23:36.29510345 +0000 UTC m=+637.688288991" watchObservedRunningTime="2026-04-20 21:23:36.296480585 +0000 UTC m=+637.689666123" Apr 20 21:23:43.440971 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.440935 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g"] Apr 20 21:23:43.444661 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.444637 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.447485 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.447459 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 21:23:43.451603 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.451583 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g"] Apr 20 21:23:43.453326 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.453298 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e4708fc-8681-42d6-9131-fa9d09165d81-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.453467 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.453358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.453467 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.453448 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.453600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.453480 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.453600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.453510 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lhf\" (UniqueName: \"kubernetes.io/projected/3e4708fc-8681-42d6-9131-fa9d09165d81-kube-api-access-m5lhf\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.453600 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.453541 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554117 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554084 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554117 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554118 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554296 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lhf\" (UniqueName: \"kubernetes.io/projected/3e4708fc-8681-42d6-9131-fa9d09165d81-kube-api-access-m5lhf\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554296 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554397 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e4708fc-8681-42d6-9131-fa9d09165d81-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554397 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554374 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 
20 21:23:43.554591 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554571 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554657 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.554714 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.554694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.556260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.556238 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e4708fc-8681-42d6-9131-fa9d09165d81-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.556783 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.556759 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e4708fc-8681-42d6-9131-fa9d09165d81-tls-certs\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.561045 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.561025 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lhf\" (UniqueName: \"kubernetes.io/projected/3e4708fc-8681-42d6-9131-fa9d09165d81-kube-api-access-m5lhf\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g\" (UID: \"3e4708fc-8681-42d6-9131-fa9d09165d81\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.757876 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.757819 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:43.881236 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:43.881206 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g"] Apr 20 21:23:43.883702 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:23:43.883659 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e4708fc_8681_42d6_9131_fa9d09165d81.slice/crio-3aa9d7f49f7fdb3a32974f9c536c8296f700edf9fbcc252429715d59f049d2dc WatchSource:0}: Error finding container 3aa9d7f49f7fdb3a32974f9c536c8296f700edf9fbcc252429715d59f049d2dc: Status 404 returned error can't find the container with id 3aa9d7f49f7fdb3a32974f9c536c8296f700edf9fbcc252429715d59f049d2dc Apr 20 21:23:44.311195 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:44.311159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" event={"ID":"3e4708fc-8681-42d6-9131-fa9d09165d81","Type":"ContainerStarted","Data":"362ee598394a92c56a46441a6715a80864e6de504496f515d887288e64f6f277"} Apr 20 21:23:44.311195 ip-10-0-132-45 kubenswrapper[2567]: 
I0420 21:23:44.311201 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" event={"ID":"3e4708fc-8681-42d6-9131-fa9d09165d81","Type":"ContainerStarted","Data":"3aa9d7f49f7fdb3a32974f9c536c8296f700edf9fbcc252429715d59f049d2dc"} Apr 20 21:23:45.730590 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.730557 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p"] Apr 20 21:23:45.867893 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.867856 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p"] Apr 20 21:23:45.868046 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.867975 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:45.870400 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.870373 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 21:23:45.976539 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.976509 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:45.976717 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.976569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: 
\"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:45.976717 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.976617 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/176acc20-3591-4f08-a9eb-ae47e60db60f-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:45.976717 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.976667 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsq8\" (UniqueName: \"kubernetes.io/projected/176acc20-3591-4f08-a9eb-ae47e60db60f-kube-api-access-2bsq8\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:45.976717 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.976708 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:45.976905 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:45.976758 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078081 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078238 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078238 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/176acc20-3591-4f08-a9eb-ae47e60db60f-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078238 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078180 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsq8\" (UniqueName: \"kubernetes.io/projected/176acc20-3591-4f08-a9eb-ae47e60db60f-kube-api-access-2bsq8\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078238 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078216 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078474 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078571 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078547 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078690 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078604 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.078690 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.078624 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.080520 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.080493 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/176acc20-3591-4f08-a9eb-ae47e60db60f-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.080763 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.080745 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/176acc20-3591-4f08-a9eb-ae47e60db60f-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.086127 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.086105 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsq8\" (UniqueName: \"kubernetes.io/projected/176acc20-3591-4f08-a9eb-ae47e60db60f-kube-api-access-2bsq8\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p\" (UID: \"176acc20-3591-4f08-a9eb-ae47e60db60f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.178236 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.178204 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:46.306682 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.306649 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p"] Apr 20 21:23:46.308347 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:23:46.308316 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod176acc20_3591_4f08_a9eb_ae47e60db60f.slice/crio-dd75e59a2706c2a36b1b460ae030840f4d935f3d27a6c11f08c2ed9441939795 WatchSource:0}: Error finding container dd75e59a2706c2a36b1b460ae030840f4d935f3d27a6c11f08c2ed9441939795: Status 404 returned error can't find the container with id dd75e59a2706c2a36b1b460ae030840f4d935f3d27a6c11f08c2ed9441939795 Apr 20 21:23:46.319219 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:46.319194 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" event={"ID":"176acc20-3591-4f08-a9eb-ae47e60db60f","Type":"ContainerStarted","Data":"dd75e59a2706c2a36b1b460ae030840f4d935f3d27a6c11f08c2ed9441939795"} Apr 20 21:23:47.295688 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:47.295651 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr" Apr 20 21:23:47.325126 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:47.325089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" event={"ID":"176acc20-3591-4f08-a9eb-ae47e60db60f","Type":"ContainerStarted","Data":"ab0c85c66e30290c613f18241e8e12734194b0dcd9989b11737b687dde17016c"} Apr 20 21:23:49.334721 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:49.334688 2567 generic.go:358] "Generic (PLEG): container finished" podID="3e4708fc-8681-42d6-9131-fa9d09165d81" 
containerID="362ee598394a92c56a46441a6715a80864e6de504496f515d887288e64f6f277" exitCode=0 Apr 20 21:23:49.335119 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:49.334763 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" event={"ID":"3e4708fc-8681-42d6-9131-fa9d09165d81","Type":"ContainerDied","Data":"362ee598394a92c56a46441a6715a80864e6de504496f515d887288e64f6f277"} Apr 20 21:23:50.340070 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:50.340035 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" event={"ID":"3e4708fc-8681-42d6-9131-fa9d09165d81","Type":"ContainerStarted","Data":"7c678164933d75d308e6b2af7a936b5800f506eeb1ebb87e77019b1c53258fb4"} Apr 20 21:23:50.340606 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:50.340235 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:23:50.357904 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:50.357847 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" podStartSLOduration=7.174989105 podStartE2EDuration="7.357829514s" podCreationTimestamp="2026-04-20 21:23:43 +0000 UTC" firstStartedPulling="2026-04-20 21:23:49.335544637 +0000 UTC m=+650.728730159" lastFinishedPulling="2026-04-20 21:23:49.518385036 +0000 UTC m=+650.911570568" observedRunningTime="2026-04-20 21:23:50.356818844 +0000 UTC m=+651.750004396" watchObservedRunningTime="2026-04-20 21:23:50.357829514 +0000 UTC m=+651.751015059" Apr 20 21:23:55.357836 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:55.357803 2567 generic.go:358] "Generic (PLEG): container finished" podID="176acc20-3591-4f08-a9eb-ae47e60db60f" containerID="ab0c85c66e30290c613f18241e8e12734194b0dcd9989b11737b687dde17016c" exitCode=0 Apr 20 21:23:55.358216 ip-10-0-132-45 kubenswrapper[2567]: I0420 
21:23:55.357840 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" event={"ID":"176acc20-3591-4f08-a9eb-ae47e60db60f","Type":"ContainerDied","Data":"ab0c85c66e30290c613f18241e8e12734194b0dcd9989b11737b687dde17016c"} Apr 20 21:23:56.368811 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:56.368774 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" event={"ID":"176acc20-3591-4f08-a9eb-ae47e60db60f","Type":"ContainerStarted","Data":"7669b223ec3f046663db4402c89882062f2ddd6b91501a8548c673082e4215df"} Apr 20 21:23:56.369276 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:56.369002 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 21:23:56.385046 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:23:56.385003 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" podStartSLOduration=11.184214472 podStartE2EDuration="11.384992288s" podCreationTimestamp="2026-04-20 21:23:45 +0000 UTC" firstStartedPulling="2026-04-20 21:23:55.358355332 +0000 UTC m=+656.751540852" lastFinishedPulling="2026-04-20 21:23:55.559133135 +0000 UTC m=+656.952318668" observedRunningTime="2026-04-20 21:23:56.383850935 +0000 UTC m=+657.777036475" watchObservedRunningTime="2026-04-20 21:23:56.384992288 +0000 UTC m=+657.778177858" Apr 20 21:24:01.355900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:24:01.355867 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g" Apr 20 21:24:07.384826 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:24:07.384796 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p" Apr 20 
21:36:04.257933 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:04.257905 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"] Apr 20 21:36:04.260521 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:04.258133 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" podUID="7870b953-b996-430f-9a37-63c6dd2c4c76" containerName="manager" containerID="cri-o://3fb2e8a600452d7c68d8e7427ffcdc6bb136333c13301cb8389206969ddb8ea3" gracePeriod=10 Apr 20 21:36:04.801965 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:04.801935 2567 generic.go:358] "Generic (PLEG): container finished" podID="7870b953-b996-430f-9a37-63c6dd2c4c76" containerID="3fb2e8a600452d7c68d8e7427ffcdc6bb136333c13301cb8389206969ddb8ea3" exitCode=0 Apr 20 21:36:04.802106 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:04.802003 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" event={"ID":"7870b953-b996-430f-9a37-63c6dd2c4c76","Type":"ContainerDied","Data":"3fb2e8a600452d7c68d8e7427ffcdc6bb136333c13301cb8389206969ddb8ea3"} Apr 20 21:36:04.896560 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:04.896539 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"
Apr 20 21:36:05.041596 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.041571 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7870b953-b996-430f-9a37-63c6dd2c4c76-extensions-socket-volume\") pod \"7870b953-b996-430f-9a37-63c6dd2c4c76\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") "
Apr 20 21:36:05.041740 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.041650 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmxst\" (UniqueName: \"kubernetes.io/projected/7870b953-b996-430f-9a37-63c6dd2c4c76-kube-api-access-rmxst\") pod \"7870b953-b996-430f-9a37-63c6dd2c4c76\" (UID: \"7870b953-b996-430f-9a37-63c6dd2c4c76\") "
Apr 20 21:36:05.041908 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.041887 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7870b953-b996-430f-9a37-63c6dd2c4c76-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "7870b953-b996-430f-9a37-63c6dd2c4c76" (UID: "7870b953-b996-430f-9a37-63c6dd2c4c76"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:36:05.043460 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.043439 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7870b953-b996-430f-9a37-63c6dd2c4c76-kube-api-access-rmxst" (OuterVolumeSpecName: "kube-api-access-rmxst") pod "7870b953-b996-430f-9a37-63c6dd2c4c76" (UID: "7870b953-b996-430f-9a37-63c6dd2c4c76"). InnerVolumeSpecName "kube-api-access-rmxst".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:36:05.143075 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.143054 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rmxst\" (UniqueName: \"kubernetes.io/projected/7870b953-b996-430f-9a37-63c6dd2c4c76-kube-api-access-rmxst\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:36:05.143159 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.143076 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7870b953-b996-430f-9a37-63c6dd2c4c76-extensions-socket-volume\") on node \"ip-10-0-132-45.ec2.internal\" DevicePath \"\""
Apr 20 21:36:05.807394 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.807358 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq" event={"ID":"7870b953-b996-430f-9a37-63c6dd2c4c76","Type":"ContainerDied","Data":"c2e7262820679b3834678950a5011708b762f0540db57e9b79c1bdfaa115d39f"}
Apr 20 21:36:05.807394 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.807385 2567 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"
Apr 20 21:36:05.807932 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.807405 2567 scope.go:117] "RemoveContainer" containerID="3fb2e8a600452d7c68d8e7427ffcdc6bb136333c13301cb8389206969ddb8ea3"
Apr 20 21:36:05.824572 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.824549 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"]
Apr 20 21:36:05.829223 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:05.829197 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2mmcq"]
Apr 20 21:36:07.126623 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:36:07.126579 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7870b953-b996-430f-9a37-63c6dd2c4c76" path="/var/lib/kubelet/pods/7870b953-b996-430f-9a37-63c6dd2c4c76/volumes"
Apr 20 21:37:10.509317 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.509283 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"]
Apr 20 21:37:10.509893 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.509636 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7870b953-b996-430f-9a37-63c6dd2c4c76" containerName="manager"
Apr 20 21:37:10.509893 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.509647 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7870b953-b996-430f-9a37-63c6dd2c4c76" containerName="manager"
Apr 20 21:37:10.509893 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.509699 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7870b953-b996-430f-9a37-63c6dd2c4c76" containerName="manager"
Apr 20 21:37:10.512878 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.512856 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.517369 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.517347 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 21:37:10.517369 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.517365 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-wnf2g\""
Apr 20 21:37:10.517572 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.517367 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 21:37:10.525254 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.525234 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"]
Apr 20 21:37:10.620946 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.620923 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjz4p\" (UniqueName: \"kubernetes.io/projected/7e629527-2bda-465b-b830-373e66b5fa53-kube-api-access-fjz4p\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p4d9s\" (UID: \"7e629527-2bda-465b-b830-373e66b5fa53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.621050 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.620966 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e629527-2bda-465b-b830-373e66b5fa53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p4d9s\" (UID: \"7e629527-2bda-465b-b830-373e66b5fa53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.722133
ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.722106 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjz4p\" (UniqueName: \"kubernetes.io/projected/7e629527-2bda-465b-b830-373e66b5fa53-kube-api-access-fjz4p\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p4d9s\" (UID: \"7e629527-2bda-465b-b830-373e66b5fa53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.722226 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.722158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e629527-2bda-465b-b830-373e66b5fa53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p4d9s\" (UID: \"7e629527-2bda-465b-b830-373e66b5fa53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.722585 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.722566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e629527-2bda-465b-b830-373e66b5fa53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p4d9s\" (UID: \"7e629527-2bda-465b-b830-373e66b5fa53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.731813 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.731791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjz4p\" (UniqueName: \"kubernetes.io/projected/7e629527-2bda-465b-b830-373e66b5fa53-kube-api-access-fjz4p\") pod \"kuadrant-operator-controller-manager-55c7f4c975-p4d9s\" (UID: \"7e629527-2bda-465b-b830-373e66b5fa53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.822667 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.822640 2567 util.go:30] "No sandbox for pod
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:10.946718 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.946696 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"]
Apr 20 21:37:10.949248 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:37:10.949224 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e629527_2bda_465b_b830_373e66b5fa53.slice/crio-93cc5c156a424ae501057628a7e8f782fec731066725b75674e5ffe7ef46d7e8 WatchSource:0}: Error finding container 93cc5c156a424ae501057628a7e8f782fec731066725b75674e5ffe7ef46d7e8: Status 404 returned error can't find the container with id 93cc5c156a424ae501057628a7e8f782fec731066725b75674e5ffe7ef46d7e8
Apr 20 21:37:10.954459 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:10.952731 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:37:11.046230 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:11.046198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s" event={"ID":"7e629527-2bda-465b-b830-373e66b5fa53","Type":"ContainerStarted","Data":"efefd602001faf2afc78a676e549aa1ae976e90d2a914850099c9c751b1ee783"}
Apr 20 21:37:11.046230 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:11.046239 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s" event={"ID":"7e629527-2bda-465b-b830-373e66b5fa53","Type":"ContainerStarted","Data":"93cc5c156a424ae501057628a7e8f782fec731066725b75674e5ffe7ef46d7e8"}
Apr 20 21:37:11.046396 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:11.046303 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:37:11.066001 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:11.065966 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s" podStartSLOduration=1.065956485 podStartE2EDuration="1.065956485s" podCreationTimestamp="2026-04-20 21:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:37:11.064387762 +0000 UTC m=+1452.457573303" watchObservedRunningTime="2026-04-20 21:37:11.065956485 +0000 UTC m=+1452.459142065"
Apr 20 21:37:22.052160 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:37:22.052123 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-p4d9s"
Apr 20 21:46:51.979491 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:51.979462 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-85fc55dd88-ssfl8_8ac13ae9-ad58-4d35-8a94-aecd971a7ba1/manager/0.log"
Apr 20 21:46:52.215020 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:52.214987 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-b5nll_2884d6da-2400-4e61-9656-9e236c686a9c/postgres/0.log"
Apr 20 21:46:54.094551 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:54.094518 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-p4d9s_7e629527-2bda-465b-b830-373e66b5fa53/manager/0.log"
Apr 20 21:46:55.016260 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:55.016233 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-b57dc9cf9-xqh89_7ba62331-8f35-44e1-85e5-8797b1e56dea/kube-auth-proxy/0.log"
Apr 20 21:46:55.592470 ip-10-0-132-45
kubenswrapper[2567]: I0420 21:46:55.592412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr_0549a5a7-0007-4c5a-8055-17e6a5699efc/main/0.log"
Apr 20 21:46:55.598880 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:55.598858 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-t2gxr_0549a5a7-0007-4c5a-8055-17e6a5699efc/storage-initializer/0.log"
Apr 20 21:46:56.053928 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:56.053897 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g_3e4708fc-8681-42d6-9131-fa9d09165d81/storage-initializer/0.log"
Apr 20 21:46:56.060730 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:56.060709 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-kg42g_3e4708fc-8681-42d6-9131-fa9d09165d81/main/0.log"
Apr 20 21:46:56.177180 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:56.177155 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p_176acc20-3591-4f08-a9eb-ae47e60db60f/storage-initializer/0.log"
Apr 20 21:46:56.184241 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:46:56.184222 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-fsj7p_176acc20-3591-4f08-a9eb-ae47e60db60f/main/0.log"
Apr 20 21:47:03.002663 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:03.002631 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rjql2_c487441d-7d18-4b60-b94d-d6e7a1fdc1a0/global-pull-secret-syncer/0.log"
Apr 20 21:47:03.060200 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:03.060178 2567 log.go:25] "Finished parsing log file"
path="/var/log/pods/kube-system_konnectivity-agent-2m2ck_790f57a1-725b-4f47-abb1-5623730655e9/konnectivity-agent/0.log"
Apr 20 21:47:03.199244 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:03.199218 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-45.ec2.internal_8209567a38135e157ae32e066a471fe4/haproxy/0.log"
Apr 20 21:47:07.777401 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:07.777371 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-p4d9s_7e629527-2bda-465b-b830-373e66b5fa53/manager/0.log"
Apr 20 21:47:09.448968 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.448938 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-47gc7_7c8850e0-79f1-40dd-be01-35e964ad62be/cluster-monitoring-operator/0.log"
Apr 20 21:47:09.468616 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.468578 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jr5nj_f17ccda7-2ca6-4bc8-b586-635850795b77/kube-state-metrics/0.log"
Apr 20 21:47:09.486578 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.486559 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jr5nj_f17ccda7-2ca6-4bc8-b586-635850795b77/kube-rbac-proxy-main/0.log"
Apr 20 21:47:09.505339 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.505321 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jr5nj_f17ccda7-2ca6-4bc8-b586-635850795b77/kube-rbac-proxy-self/0.log"
Apr 20 21:47:09.528999 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.528982 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6f8955cdbb-mw27t_af82c672-3cf2-41b2-9e5c-7784bf46eec5/metrics-server/0.log"
Apr 20
21:47:09.579507 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.579478 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5nqj6_45874952-f43e-4c25-9d03-c35a06b5dbbd/node-exporter/0.log"
Apr 20 21:47:09.596747 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.596728 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5nqj6_45874952-f43e-4c25-9d03-c35a06b5dbbd/kube-rbac-proxy/0.log"
Apr 20 21:47:09.617030 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.617012 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5nqj6_45874952-f43e-4c25-9d03-c35a06b5dbbd/init-textfile/0.log"
Apr 20 21:47:09.860740 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.860719 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74012b25-4329-404e-b0c9-0194ff3c8ce9/prometheus/0.log"
Apr 20 21:47:09.880045 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.880027 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74012b25-4329-404e-b0c9-0194ff3c8ce9/config-reloader/0.log"
Apr 20 21:47:09.901069 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.901050 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74012b25-4329-404e-b0c9-0194ff3c8ce9/thanos-sidecar/0.log"
Apr 20 21:47:09.919757 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.919739 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74012b25-4329-404e-b0c9-0194ff3c8ce9/kube-rbac-proxy-web/0.log"
Apr 20 21:47:09.941184 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.941161 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74012b25-4329-404e-b0c9-0194ff3c8ce9/kube-rbac-proxy/0.log"
Apr 20 21:47:09.961234 ip-10-0-132-45 kubenswrapper[2567]:
I0420 21:47:09.961210 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74012b25-4329-404e-b0c9-0194ff3c8ce9/kube-rbac-proxy-thanos/0.log"
Apr 20 21:47:09.980957 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:09.980943 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_74012b25-4329-404e-b0c9-0194ff3c8ce9/init-config-reloader/0.log"
Apr 20 21:47:10.007606 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:10.007589 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d65z5_11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a/prometheus-operator/0.log"
Apr 20 21:47:10.026345 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:10.026327 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d65z5_11766c9d-c9d9-4a44-b8ad-cd8bd0c89f4a/kube-rbac-proxy/0.log"
Apr 20 21:47:10.051453 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:10.051411 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-8nwtj_8f276464-4ba8-4641-8b8f-67e5405b3f5e/prometheus-operator-admission-webhook/0.log"
Apr 20 21:47:11.230587 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.230557 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-trkwb_777c3126-65de-4706-a59a-112f7ec7916a/networking-console-plugin/0.log"
Apr 20 21:47:11.293900 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.293873 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"]
Apr 20 21:47:11.297295 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.297258 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.299726 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.299709 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rs84k\"/\"kube-root-ca.crt\""
Apr 20 21:47:11.300470 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.300457 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rs84k\"/\"openshift-service-ca.crt\""
Apr 20 21:47:11.300542 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.300478 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rs84k\"/\"default-dockercfg-b9scs\""
Apr 20 21:47:11.306211 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.306187 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"]
Apr 20 21:47:11.370652 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.370620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-sys\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.370771 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.370660 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-podres\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.370771 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.370706 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-lib-modules\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.370771 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.370736 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-proc\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.370885 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.370777 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98j6\" (UniqueName: \"kubernetes.io/projected/ccb1e226-08fc-407a-b161-ebd06f880d9e-kube-api-access-m98j6\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471181 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m98j6\" (UniqueName: \"kubernetes.io/projected/ccb1e226-08fc-407a-b161-ebd06f880d9e-kube-api-access-m98j6\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471295 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-sys\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") "
pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471295 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471211 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-podres\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471295 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471247 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-lib-modules\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471295 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-proc\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471298 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-sys\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-proc\") pod
\"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-podres\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.471493 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.471448 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccb1e226-08fc-407a-b161-ebd06f880d9e-lib-modules\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.478082 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.478066 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98j6\" (UniqueName: \"kubernetes.io/projected/ccb1e226-08fc-407a-b161-ebd06f880d9e-kube-api-access-m98j6\") pod \"perf-node-gather-daemonset-z4jxs\" (UID: \"ccb1e226-08fc-407a-b161-ebd06f880d9e\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.607125 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.607096 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:11.735567 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.735539 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"]
Apr 20 21:47:11.737981 ip-10-0-132-45 kubenswrapper[2567]: W0420 21:47:11.737950 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podccb1e226_08fc_407a_b161_ebd06f880d9e.slice/crio-8f132b7ccb5936c4f3c08f3acbe8f07fd6cf3d9c598e76876b0bf1848426ca61 WatchSource:0}: Error finding container 8f132b7ccb5936c4f3c08f3acbe8f07fd6cf3d9c598e76876b0bf1848426ca61: Status 404 returned error can't find the container with id 8f132b7ccb5936c4f3c08f3acbe8f07fd6cf3d9c598e76876b0bf1848426ca61
Apr 20 21:47:11.740054 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:11.740037 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:47:12.040662 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:12.040628 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs" event={"ID":"ccb1e226-08fc-407a-b161-ebd06f880d9e","Type":"ContainerStarted","Data":"b482d32dae9bb5b3a2847f6ad680fec0bf80933620bab5a086e4b2096de53149"}
Apr 20 21:47:12.040803 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:12.040678 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs"
Apr 20 21:47:12.040803 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:12.040696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs" event={"ID":"ccb1e226-08fc-407a-b161-ebd06f880d9e","Type":"ContainerStarted","Data":"8f132b7ccb5936c4f3c08f3acbe8f07fd6cf3d9c598e76876b0bf1848426ca61"}
Apr 20 21:47:12.056413 ip-10-0-132-45 kubenswrapper[2567]: I0420
21:47:12.056374 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs" podStartSLOduration=1.056361597 podStartE2EDuration="1.056361597s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:47:12.054132671 +0000 UTC m=+2053.447318211" watchObservedRunningTime="2026-04-20 21:47:12.056361597 +0000 UTC m=+2053.449547139"
Apr 20 21:47:13.485458 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:13.485411 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qq8hb_9ba83c57-a195-426d-ab9b-969d9434f8d7/dns/0.log"
Apr 20 21:47:13.504599 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:13.504576 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qq8hb_9ba83c57-a195-426d-ab9b-969d9434f8d7/kube-rbac-proxy/0.log"
Apr 20 21:47:13.546019 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:13.545999 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c5gzg_e05ebd33-1ace-4f1d-8379-0925d5e79b13/dns-node-resolver/0.log"
Apr 20 21:47:14.046845 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:14.046816 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jk6wx_2a4eea4f-8eef-446c-a518-bcb5b140ee35/node-ca/0.log"
Apr 20 21:47:14.957955 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:14.957921 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-b57dc9cf9-xqh89_7ba62331-8f35-44e1-85e5-8797b1e56dea/kube-auth-proxy/0.log"
Apr 20 21:47:15.481098 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:15.481069 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9n92s_a0f24ff9-e85c-47ac-9ee4-3deb25a046a8/serve-healthcheck-canary/0.log"
Apr 20 21:47:15.952091 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:15.952064 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gggf_2d4d9854-af8f-42c6-928a-3bbb105e3f5a/kube-rbac-proxy/0.log" Apr 20 21:47:15.971485 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:15.971461 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gggf_2d4d9854-af8f-42c6-928a-3bbb105e3f5a/exporter/0.log" Apr 20 21:47:15.994302 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:15.994281 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9gggf_2d4d9854-af8f-42c6-928a-3bbb105e3f5a/extractor/0.log" Apr 20 21:47:18.055263 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:18.055230 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-z4jxs" Apr 20 21:47:18.106111 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:18.106085 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-85fc55dd88-ssfl8_8ac13ae9-ad58-4d35-8a94-aecd971a7ba1/manager/0.log" Apr 20 21:47:18.216847 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:18.216826 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-b5nll_2884d6da-2400-4e61-9656-9e236c686a9c/postgres/0.log" Apr 20 21:47:19.274538 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:19.274511 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-56b87855f9-gzqbf_b0a3a70e-2b48-4d4a-915c-4534bc49589b/manager/0.log" Apr 20 21:47:24.916384 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:24.916355 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78h5h_b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0/kube-multus-additional-cni-plugins/0.log" Apr 20 21:47:24.937239 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:24.937214 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78h5h_b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0/egress-router-binary-copy/0.log" Apr 20 21:47:24.955529 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:24.955508 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78h5h_b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0/cni-plugins/0.log" Apr 20 21:47:24.974204 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:24.974185 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78h5h_b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0/bond-cni-plugin/0.log" Apr 20 21:47:24.992857 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:24.992838 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78h5h_b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0/routeoverride-cni/0.log" Apr 20 21:47:25.013941 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:25.013922 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78h5h_b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0/whereabouts-cni-bincopy/0.log" Apr 20 21:47:25.033404 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:25.033374 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-78h5h_b8e3dae1-8a79-4f24-a9f1-2cf6da0045a0/whereabouts-cni/0.log" Apr 20 21:47:25.351685 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:25.351658 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr6d2_27234e93-fd7f-443e-8c3a-28ab70606c45/kube-multus/0.log" Apr 20 21:47:25.393227 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:47:25.393192 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6p5ds_ec3d4534-0f04-46f4-8eae-d37ac21ac0c6/network-metrics-daemon/0.log" Apr 20 21:47:25.411250 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:25.411228 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6p5ds_ec3d4534-0f04-46f4-8eae-d37ac21ac0c6/kube-rbac-proxy/0.log" Apr 20 21:47:26.245838 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:26.245815 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/ovn-controller/0.log" Apr 20 21:47:26.278476 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:26.278454 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/ovn-acl-logging/0.log" Apr 20 21:47:26.295335 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:26.295320 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/kube-rbac-proxy-node/0.log" Apr 20 21:47:26.315095 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:26.315080 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 21:47:26.333611 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:26.333593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/northd/0.log" Apr 20 21:47:26.352473 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:26.352455 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/nbdb/0.log" Apr 20 21:47:26.374534 ip-10-0-132-45 
kubenswrapper[2567]: I0420 21:47:26.374514 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/sbdb/0.log" Apr 20 21:47:26.465544 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:26.465525 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5qmcs_b73ab204-677f-45ce-8d9d-26e042fd308c/ovnkube-controller/0.log" Apr 20 21:47:28.010504 ip-10-0-132-45 kubenswrapper[2567]: I0420 21:47:28.010474 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7pqmr_35b8e8ff-14c6-4807-bd77-b37eaea1544c/network-check-target-container/0.log"