Apr 21 03:54:41.153280 ip-10-0-131-182 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 03:54:41.153296 ip-10-0-131-182 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 03:54:41.153306 ip-10-0-131-182 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 03:54:41.153676 ip-10-0-131-182 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 03:54:51.322656 ip-10-0-131-182 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 03:54:51.322673 ip-10-0-131-182 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 17fd98aa91b0499ca6abb6991a27d8e3 --
Apr 21 03:56:59.338643 ip-10-0-131-182 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 03:56:59.682581 ip-10-0-131-182 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:56:59.682581 ip-10-0-131-182 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 03:56:59.682581 ip-10-0-131-182 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:56:59.682581 ip-10-0-131-182 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 03:56:59.682581 ip-10-0-131-182 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:56:59.685005 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.684910    2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 03:56:59.688769 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688745    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:56:59.688769 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688766    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:59.688769 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688770    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:59.688769 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688774    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:59.688769 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688777    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688780    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688783    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688786    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688789    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688792    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688795    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688798    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688800    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688803    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688806    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688808    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688811    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688814    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688817    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688819    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688822    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688825    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688827    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688830    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:56:59.688971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688833    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688838    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688853    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688856    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688859    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688862    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688865    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688868    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688871    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688874    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688876    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688879    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688881    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688884    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688887    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688891    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688893    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688896    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688899    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:56:59.689465 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688902    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688905    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688908    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688911    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688913    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688916    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688918    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688921    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688923    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688926    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688928    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688931    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688934    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688937    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688940    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688942    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688945    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688947    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688950    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688953    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:56:59.689937 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688956    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688959    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688961    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688964    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688966    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688969    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688971    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688975    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688978    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688981    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688983    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688986    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688989    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688991    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688994    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688997    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.688999    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689002    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689006    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:56:59.690438 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689010    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689013    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689016    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689018    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689451    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689459    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689462    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689465    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689468    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689471    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689474    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689478    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689482    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689485    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689489    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689492    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689494    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689497    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689500    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:56:59.690914 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689503    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689505    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689508    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689511    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689513    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689516    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689519    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689521    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689524    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689527    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689529    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689532    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689535    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689538    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689540    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689543    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689545    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689548    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689551    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:56:59.691400 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689553    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689556    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689558    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689561    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689564    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689566    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689569    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689572    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689575    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689577    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689580    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689584    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689587    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689590    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689593    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689596    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689599    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689602    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689604    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689607    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:59.691875 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689610    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689612    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689615    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689617    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689620    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689623    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689625    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689628    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689630    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689633    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689635    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689639    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689642    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689644    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689647    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689649    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689652    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689654    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689657    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689659    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:56:59.692384 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689662    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689664    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689667    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689669    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689671    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689674    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689677    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689681    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689684    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689686    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689689    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.689691    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690162    2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690171    2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690177    2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690182    2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690187    2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690190    2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690195    2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690199    2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690202    2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 03:56:59.692871 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690206    2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690210    2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690214    2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690217    2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690220    2578 flags.go:64] FLAG: --cgroup-root=""
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690223    2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690226    2578 flags.go:64] FLAG: --client-ca-file=""
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690229    2578 flags.go:64] FLAG: --cloud-config=""
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690232    2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690235    2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690240    2578 flags.go:64] FLAG: --cluster-domain=""
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690243    2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690246    2578 flags.go:64] FLAG: --config-dir=""
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690249    2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690253    2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690257    2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690260    2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690263    2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690267    2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690270    2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690272    2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690276    2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690279    2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690282    2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690286    2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 03:56:59.693412 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690289    2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690293    2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690295    2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690303    2578 flags.go:64] FLAG: --enable-server="true"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690318    2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690324    2578 flags.go:64] FLAG: --event-burst="100"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690328    2578 flags.go:64] FLAG: --event-qps="50"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690331    2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690335    2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690338    2578 flags.go:64] FLAG: --eviction-hard=""
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690342    2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690345    2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690349    2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690351    2578 flags.go:64] FLAG: --eviction-soft=""
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690354    2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690357    2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690360    2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690363    2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690366    2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690369    2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 03:56:59.694020 ip-10-0-131-182
kubenswrapper[2578]: I0421 03:56:59.690372 2578 flags.go:64] FLAG: --feature-gates="" Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690376 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690379 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690382 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690385 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690388 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 21 03:56:59.694020 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690391 2578 flags.go:64] FLAG: --help="false" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690394 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-131-182.ec2.internal" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690398 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690400 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690403 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690407 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690410 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690413 2578 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690416 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690420 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690424 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690427 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690431 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690433 2578 flags.go:64] FLAG: --kube-reserved="" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690437 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690440 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690443 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690446 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690449 2578 flags.go:64] FLAG: --lock-file="" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690452 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690456 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690459 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: 
I0421 03:56:59.690464 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 03:56:59.694687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690467 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690470 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690473 2578 flags.go:64] FLAG: --logging-format="text" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690475 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690479 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690482 2578 flags.go:64] FLAG: --manifest-url="" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690485 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690490 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690493 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690497 2578 flags.go:64] FLAG: --max-pods="110" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690500 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690503 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690506 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690509 2578 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690512 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690515 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690518 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690526 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690529 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690533 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690537 2578 flags.go:64] FLAG: --pod-cidr="" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690540 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690546 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690549 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 03:56:59.695263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690552 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690555 2578 flags.go:64] FLAG: --port="10250" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690558 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690561 
2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c2f07de6b6366f43" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690564 2578 flags.go:64] FLAG: --qos-reserved="" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690567 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690570 2578 flags.go:64] FLAG: --register-node="true" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690573 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690576 2578 flags.go:64] FLAG: --register-with-taints="" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690579 2578 flags.go:64] FLAG: --registry-burst="10" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690582 2578 flags.go:64] FLAG: --registry-qps="5" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690585 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690587 2578 flags.go:64] FLAG: --reserved-memory="" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690591 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690594 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690597 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690600 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690602 2578 flags.go:64] FLAG: --runonce="false" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690605 2578 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690608 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690611 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690614 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690617 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690620 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690623 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690626 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 03:56:59.695865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690629 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690632 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690636 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690639 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690642 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690646 2578 flags.go:64] FLAG: --system-cgroups="" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690649 2578 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690654 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690657 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690659 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690664 2578 flags.go:64] FLAG: --tls-min-version="" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690667 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690670 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690673 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690676 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690679 2578 flags.go:64] FLAG: --v="2" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690683 2578 flags.go:64] FLAG: --version="false" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690687 2578 flags.go:64] FLAG: --vmodule="" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690692 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.690695 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690818 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:56:59.696524 ip-10-0-131-182 
kubenswrapper[2578]: W0421 03:56:59.690823 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690826 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690829 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:56:59.696524 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690832 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690835 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690837 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690840 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690843 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690845 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690848 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690850 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690853 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690856 2578 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690858 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690861 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690867 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690872 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690874 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690877 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690880 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690882 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690886 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 03:56:59.697093 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690890 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690893 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690896 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690899 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690901 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690904 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690907 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690909 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690912 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690915 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690918 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690920 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690923 2578 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImages Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690926 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690928 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690931 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690934 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690936 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690938 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690942 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:56:59.697646 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690944 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690947 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690949 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690952 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690954 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: 
W0421 03:56:59.690957 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690962 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690965 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690967 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690969 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690972 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690976 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690980 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690983 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690985 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690988 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690990 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690993 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690995 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.690998 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:56:59.698153 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691001 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691003 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691006 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691008 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691010 2578 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691013 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691015 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691018 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691020 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691023 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691026 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691028 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691030 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691033 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691037 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691040 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691042 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691045 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691048 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691052 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:59.698664 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691054 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691057 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.691059 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.691065 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.697886 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.697906 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697959 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697964 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697968 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697971 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697974 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697977 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697980 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697983 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697986 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697989 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:56:59.699159 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697991 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697994 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.697997 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698000 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698002 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698005 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698008 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698011 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698017 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698021 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698023 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698026 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698029 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698032 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698034 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698038 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698043 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698046 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698049 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:56:59.699606 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698052 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698056 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698062 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698065 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698068 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698070 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698074 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698076 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698079 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698082 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698084 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698087 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698089 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698092 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698095 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698097 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698100 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698103 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698106 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:59.700068 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698108 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698111 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698114 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698116 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698119 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698122 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698124 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698127 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698129 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698132 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698134 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698137 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698140 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698142 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698145 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698148 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698152 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698155 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698158 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:59.700545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698160 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698163 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698166 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698169 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698171 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698174 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698177 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698179 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698182 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698185 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698187 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698189 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698192 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698195 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698197 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698200 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698202 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698205 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:59.701054 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698208 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.698213 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698344 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698350 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698354 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698357 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698359 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698362 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698365 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698368 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698371 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698375 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698378 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698380 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698383 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698386 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:56:59.701636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698388 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698391 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698394 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698396 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698398 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698401 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698404 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698406 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698409 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698412 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698414 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698417 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698419 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698422 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698424 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698427 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698429 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698433 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698437 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:56:59.702044 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698440 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698442 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698445 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698447 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698450 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698452 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698455 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698457 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698460 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698463 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698466 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698468 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698471 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698474 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698476 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698479 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698481 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698483 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698486 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698489 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:56:59.702518 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698491 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698494 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698496 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698499 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698501 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698504 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698507 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698509 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698512 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698514 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698517 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698520 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698523 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698525 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698528 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698531 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698534 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698537 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698539 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698542 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:56:59.703012 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698544 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698547 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698549 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698554 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698557 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698561 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698564 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698567 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698569 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698572 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698575 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698578 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:56:59.698581 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.698585 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.699168 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 03:56:59.703512 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.701174 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 03:56:59.703882 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.702059 2578 server.go:1019] "Starting client certificate rotation"
Apr 21 03:56:59.703882 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.702155 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:56:59.703882 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.702717 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:56:59.719998 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.719979 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:56:59.723909 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.723889 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:56:59.736285 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.736263 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 21 03:56:59.741916 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.741899 2578 log.go:25] "Validated CRI v1 image API"
Apr 21 03:56:59.743110 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.743088 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 03:56:59.744268 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.744253 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 03:56:59.745580 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.745561 2578 fs.go:135] Filesystem UUIDs: map[24b91a5e-b602-4b5d-8c70-cfebbc13ea84:/dev/nvme0n1p4 47737fe6-96d3-4ac8-a925-3944e1f28214:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 21 03:56:59.745627 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.745581 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 03:56:59.750427 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.750302 2578 manager.go:217] Machine: {Timestamp:2026-04-21 03:56:59.749348634 +0000 UTC m=+0.316319117 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095516 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a31d45d654bec436a38fc99f8697a SystemUUID:ec2a31d4-5d65-4bec-436a-38fc99f8697a BootID:17fd98aa-91b0-499c-a6ab-b6991a27d8e3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ef:3d:88:34:f3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ef:3d:88:34:f3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:30:95:6d:7e:64 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 03:56:59.750427 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.750423 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 03:56:59.750531 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.750507 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 03:56:59.751384 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.751359 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 03:56:59.751516 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.751386 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-182.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 03:56:59.751600 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.751525 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 03:56:59.751600 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.751533 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 03:56:59.751600 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.751546 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:56:59.752155 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.752145 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:56:59.752892 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.752882 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:56:59.752996 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.752987 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 03:56:59.754711 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.754701 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 21 03:56:59.754767 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.754714 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 03:56:59.754767 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.754732 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 03:56:59.754767 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.754743 2578 kubelet.go:397] "Adding apiserver pod source" Apr 21 03:56:59.754767 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.754764 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 03:56:59.755628 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.755617 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:56:59.755663 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.755637 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:56:59.758214 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.758200 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 03:56:59.759444 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.759428 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 03:56:59.760893 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760879 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 03:56:59.760961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760899 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 03:56:59.760961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760908 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 03:56:59.760961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760917 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 03:56:59.760961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760934 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 03:56:59.760961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760942 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 03:56:59.760961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760951 2578 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 21 03:56:59.760961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760960 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 03:56:59.761232 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760970 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 03:56:59.761232 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760979 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 03:56:59.761232 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.760998 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 03:56:59.761232 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.761011 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 03:56:59.761620 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.761608 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 03:56:59.761669 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.761625 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 03:56:59.764980 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.764964 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 03:56:59.765058 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.765010 2578 server.go:1295] "Started kubelet" Apr 21 03:56:59.765134 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.765102 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 03:56:59.765241 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.765172 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 03:56:59.765296 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.765248 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 03:56:59.767131 ip-10-0-131-182 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 03:56:59.768232 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.767632 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jgkjt" Apr 21 03:56:59.768232 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.768013 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 03:56:59.768976 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.768956 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 21 03:56:59.771673 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.771657 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-182.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 03:56:59.772618 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.772398 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 03:56:59.772618 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.771758 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-182.ec2.internal.18a8431729f65324 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-182.ec2.internal,UID:ip-10-0-131-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-182.ec2.internal,},FirstTimestamp:2026-04-21 03:56:59.764978468 +0000 UTC m=+0.331948955,LastTimestamp:2026-04-21 03:56:59.764978468 +0000 UTC m=+0.331948955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-182.ec2.internal,}" Apr 21 03:56:59.773083 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.773047 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 03:56:59.774332 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.774296 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jgkjt" Apr 21 03:56:59.774420 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.774363 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 03:56:59.774420 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.774407 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 03:56:59.774942 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.774910 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 03:56:59.774942 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.774942 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 03:56:59.775103 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.774963 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 03:56:59.775103 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.775022 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 21 03:56:59.775201 ip-10-0-131-182 kubenswrapper[2578]: 
I0421 03:56:59.775129 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 21 03:56:59.775446 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.775430 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found" Apr 21 03:56:59.776067 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776048 2578 factory.go:55] Registering systemd factory Apr 21 03:56:59.776150 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776077 2578 factory.go:223] Registration of the systemd container factory successfully Apr 21 03:56:59.776369 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776355 2578 factory.go:153] Registering CRI-O factory Apr 21 03:56:59.776369 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776372 2578 factory.go:223] Registration of the crio container factory successfully Apr 21 03:56:59.776490 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776437 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 03:56:59.776490 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776459 2578 factory.go:103] Registering Raw factory Apr 21 03:56:59.776490 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776488 2578 manager.go:1196] Started watching for new ooms in manager Apr 21 03:56:59.776784 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.776762 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 03:56:59.776908 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.776893 2578 manager.go:319] Starting recovery of all containers Apr 21 03:56:59.779884 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.779816 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 03:56:59.780254 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.780231 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 03:56:59.784248 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.784205 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 21 03:56:59.786204 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.786182 2578 manager.go:324] Recovery completed Apr 21 03:56:59.791380 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.791303 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:56:59.793495 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.793476 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:56:59.793539 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.793512 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:56:59.793539 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.793527 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:56:59.795796 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.795778 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 03:56:59.795796 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.795794 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 03:56:59.795897 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.795814 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:56:59.798942 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.798930 2578 policy_none.go:49] "None policy: Start" Apr 21 03:56:59.798993 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.798947 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 03:56:59.798993 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.798956 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.837528 2578 manager.go:341] "Starting Device Plugin manager" Apr 21 03:56:59.851878 ip-10-0-131-182 
kubenswrapper[2578]: E0421 03:56:59.837576 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.837589 2578 server.go:85] "Starting device plugin registration server" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.837830 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.837841 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.837920 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.838011 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.838022 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.838500 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 03:56:59.851878 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.838538 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-182.ec2.internal\" not found" Apr 21 03:56:59.871797 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.871770 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 21 03:56:59.871797 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.871803 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 03:56:59.871935 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.871824 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 03:56:59.871935 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.871831 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 03:56:59.871935 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.871862 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 03:56:59.875193 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.875173 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:56:59.938419 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.938350 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:56:59.939508 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.939487 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:56:59.939594 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.939523 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:56:59.939594 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.939534 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:56:59.939594 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.939571 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-182.ec2.internal" Apr 21 03:56:59.947585 ip-10-0-131-182 kubenswrapper[2578]: I0421 
03:56:59.947570 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-182.ec2.internal" Apr 21 03:56:59.947635 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.947592 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-182.ec2.internal\": node \"ip-10-0-131-182.ec2.internal\" not found" Apr 21 03:56:59.963860 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.963836 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found" Apr 21 03:56:59.972624 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.972603 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal"] Apr 21 03:56:59.972686 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.972674 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:56:59.973556 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.973541 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:56:59.973630 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.973565 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:56:59.973630 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.973577 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:56:59.975736 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.975720 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" Apr 21 03:56:59.975736 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.975732 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:56:59.975863 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.975749 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:56:59.976444 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.976431 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:56:59.976444 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.976436 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:56:59.976569 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.976454 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:56:59.976569 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.976460 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:56:59.976569 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.976467 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:56:59.976569 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.976474 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:56:59.978605 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.978592 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal" Apr 21 03:56:59.978651 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.978615 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:56:59.979265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.979250 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:56:59.979372 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.979276 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:56:59.979372 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:56:59.979288 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:56:59.992853 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.992836 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-182.ec2.internal\" not found" node="ip-10-0-131-182.ec2.internal" Apr 21 03:56:59.996595 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:56:59.996580 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-182.ec2.internal\" not found" node="ip-10-0-131-182.ec2.internal" Apr 21 03:57:00.064867 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.064841 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found" Apr 21 03:57:00.076182 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.076158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/073b4f13c42b54bbac489c0b10c87cd3-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal\" (UID: \"073b4f13c42b54bbac489c0b10c87cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" Apr 21 03:57:00.076241 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.076189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/073b4f13c42b54bbac489c0b10c87cd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal\" (UID: \"073b4f13c42b54bbac489c0b10c87cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" Apr 21 03:57:00.076241 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.076211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/153f8e46061c7f7dc78983197c922203-config\") pod \"kube-apiserver-proxy-ip-10-0-131-182.ec2.internal\" (UID: \"153f8e46061c7f7dc78983197c922203\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal" Apr 21 03:57:00.165480 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.165446 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found" Apr 21 03:57:00.176749 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.176732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/073b4f13c42b54bbac489c0b10c87cd3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal\" (UID: \"073b4f13c42b54bbac489c0b10c87cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" Apr 21 03:57:00.176799 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.176757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/073b4f13c42b54bbac489c0b10c87cd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal\" (UID: \"073b4f13c42b54bbac489c0b10c87cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:00.176799 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.176774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/153f8e46061c7f7dc78983197c922203-config\") pod \"kube-apiserver-proxy-ip-10-0-131-182.ec2.internal\" (UID: \"153f8e46061c7f7dc78983197c922203\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:00.176863 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.176817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/153f8e46061c7f7dc78983197c922203-config\") pod \"kube-apiserver-proxy-ip-10-0-131-182.ec2.internal\" (UID: \"153f8e46061c7f7dc78983197c922203\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:00.176863 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.176839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/073b4f13c42b54bbac489c0b10c87cd3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal\" (UID: \"073b4f13c42b54bbac489c0b10c87cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:00.176863 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.176849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/073b4f13c42b54bbac489c0b10c87cd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal\" (UID: \"073b4f13c42b54bbac489c0b10c87cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:00.266243 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.266166 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:00.294623 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.294600 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:00.299425 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.299405 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:00.367084 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.367042 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:00.467577 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.467549 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:00.568202 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.568128 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:00.628390 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.628361 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:57:00.668610 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.668577 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:00.702159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.702128 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 03:57:00.702856 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.702304 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 03:57:00.702856 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.702331 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 03:57:00.769718 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.769681 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:00.775070 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.775052 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 03:57:00.778201 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.778171 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 03:51:59 +0000 UTC" deadline="2027-10-06 09:43:08.759565284 +0000 UTC"
Apr 21 03:57:00.778201 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.778195 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12797h46m7.981372804s"
Apr 21 03:57:00.778201 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:00.778174 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod153f8e46061c7f7dc78983197c922203.slice/crio-550b5d0ea84a7257db288d02013e32cbc419dd70cf6da2a787737a85f4889f84 WatchSource:0}: Error finding container 550b5d0ea84a7257db288d02013e32cbc419dd70cf6da2a787737a85f4889f84: Status 404 returned error can't find the container with id 550b5d0ea84a7257db288d02013e32cbc419dd70cf6da2a787737a85f4889f84
Apr 21 03:57:00.778679 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:00.778658 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073b4f13c42b54bbac489c0b10c87cd3.slice/crio-198dacb9488fca2c55f8f4b6f1f7b0c719ab39f64ede8e13ab10e863eef6ba65 WatchSource:0}: Error finding container 198dacb9488fca2c55f8f4b6f1f7b0c719ab39f64ede8e13ab10e863eef6ba65: Status 404 returned error can't find the container with id 198dacb9488fca2c55f8f4b6f1f7b0c719ab39f64ede8e13ab10e863eef6ba65
Apr 21 03:57:00.782739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.782723 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 03:57:00.798406 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.798385 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 03:57:00.817269 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.817245 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t5xvm"
Apr 21 03:57:00.825675 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.825619 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t5xvm"
Apr 21 03:57:00.870782 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.870752 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:00.874822 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.874773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" event={"ID":"073b4f13c42b54bbac489c0b10c87cd3","Type":"ContainerStarted","Data":"198dacb9488fca2c55f8f4b6f1f7b0c719ab39f64ede8e13ab10e863eef6ba65"}
Apr 21 03:57:00.875699 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.875682 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal" event={"ID":"153f8e46061c7f7dc78983197c922203","Type":"ContainerStarted","Data":"550b5d0ea84a7257db288d02013e32cbc419dd70cf6da2a787737a85f4889f84"}
Apr 21 03:57:00.894985 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:00.894966 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:57:00.971665 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:00.971635 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-182.ec2.internal\" not found"
Apr 21 03:57:01.063568 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.063541 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:57:01.075101 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.075079 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:01.086603 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.086544 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 03:57:01.087195 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.087183 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal"
Apr 21 03:57:01.095481 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.095461 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 03:57:01.755867 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.755674 2578 apiserver.go:52] "Watching apiserver"
Apr 21 03:57:01.762063 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.762032 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 03:57:01.762517 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.762492 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk","openshift-cluster-node-tuning-operator/tuned-ng54w","openshift-dns/node-resolver-m7pww","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal","openshift-multus/multus-additional-cni-plugins-8v8nb","openshift-network-operator/iptables-alerter-64s86","openshift-ovn-kubernetes/ovnkube-node-9ssvt","kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal","openshift-image-registry/node-ca-9j76c","openshift-multus/multus-bdjrx","openshift-multus/network-metrics-daemon-wcnkn","openshift-network-diagnostics/network-check-target-sql8b","kube-system/konnectivity-agent-kqnd2"]
Apr 21 03:57:01.765545 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.765525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.767596 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.767537 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 03:57:01.767596 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.767587 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dr94c\""
Apr 21 03:57:01.767772 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.767616 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 03:57:01.767772 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.767718 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 03:57:01.769855 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.769835 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.769949 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.769932 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m7pww"
Apr 21 03:57:01.771691 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.771628 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 03:57:01.771826 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.771704 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 03:57:01.771928 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.771908 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 03:57:01.771998 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.771920 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 03:57:01.771998 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.771961 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 03:57:01.772123 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.772019 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 03:57:01.772191 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.772140 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8qztf\""
Apr 21 03:57:01.772276 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.772253 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.772435 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.772331 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6zlb8\""
Apr 21 03:57:01.774227 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.773993 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 03:57:01.774227 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.774089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-765kf\""
Apr 21 03:57:01.774884 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.774866 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 03:57:01.774970 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.774909 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 03:57:01.776249 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.776219 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.778363 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.778342 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 03:57:01.778713 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.778685 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 03:57:01.779820 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.778968 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 03:57:01.779820 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.779214 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.779820 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.779451 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lwmrt\""
Apr 21 03:57:01.779820 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.779547 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 03:57:01.779820 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.779677 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 03:57:01.780669 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.780444 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 03:57:01.781356 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.781190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 03:57:01.781356 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.781335 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rvcl2\""
Apr 21 03:57:01.781500 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.781388 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 03:57:01.782275 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.782239 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9j76c"
Apr 21 03:57:01.784323 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.784285 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 03:57:01.784415 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.784365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 03:57:01.784415 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.784404 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 03:57:01.784525 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.784511 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kkx9s\""
Apr 21 03:57:01.784790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.784771 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.785589 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.785684 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovnkube-script-lib\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.785684 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785633 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-daemon-config\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.785684 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-multus-certs\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.785684 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785679 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvdg\" (UniqueName: \"kubernetes.io/projected/7ebaea07-99ce-462a-8ed5-d99c06d6417e-kube-api-access-bfvdg\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.785878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48171444-16ad-44d3-adcd-dbc651bb6b7e-hosts-file\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww"
Apr 21 03:57:01.785878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-device-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.785878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovnkube-config\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.785878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-env-overrides\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.785878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-system-cni-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.785878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ebaea07-99ce-462a-8ed5-d99c06d6417e-cni-binary-copy\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.785878 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-kubelet\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.786159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.785945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-hostroot\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.786159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.786159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-etc-selinux\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.786159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-systemd-units\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.786159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-slash\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.786159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-os-release\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.786159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48171444-16ad-44d3-adcd-dbc651bb6b7e-tmp-dir\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww"
Apr 21 03:57:01.786489 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h555s\" (UniqueName: \"kubernetes.io/projected/48171444-16ad-44d3-adcd-dbc651bb6b7e-kube-api-access-h555s\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww"
Apr 21 03:57:01.786489 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-systemd\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.786489 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786262 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f24fb\" (UniqueName: \"kubernetes.io/projected/29783e2f-7e33-4e0b-a972-c774556775ce-kube-api-access-f24fb\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.786489 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-k8s-cni-cncf-io\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.786489 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-conf-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.786489 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-etc-kubernetes\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-socket-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786544 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-sys-fs\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786615 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-log-socket\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29783e2f-7e33-4e0b-a972-c774556775ce-host-slash\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786669 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-registration-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-var-lib-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-node-log\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.786790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-cni-bin\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-cni-netd\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknjt\" (UniqueName: \"kubernetes.io/projected/2d01920e-91d4-4f4c-b55b-21442ffc88c5-kube-api-access-kknjt\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786865 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/29783e2f-7e33-4e0b-a972-c774556775ce-iptables-alerter-script\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovn-node-metrics-cert\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-cni-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-socket-dir-parent\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.786980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-netns\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flfdm\" (UniqueName: \"kubernetes.io/projected/39b234de-dba0-44dc-8b51-1e777c24a972-kube-api-access-flfdm\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-kubelet\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-etc-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787070 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787082 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-cnibin\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-cni-bin\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-cni-multus\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-run-netns\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.787197 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-ovn\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.787917 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787330 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:01.787917 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.787407 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vbnx2\"" Apr 21 03:57:01.789302 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:57:01.789282 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:01.789413 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:01.789377 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:01.789413 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.789283 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:01.789526 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:01.789440 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:01.791733 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.791709 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:01.793737 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.793717 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 03:57:01.793821 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.793762 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8d2pz\"" Apr 21 03:57:01.794004 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.793989 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 03:57:01.815777 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.815757 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:01.826272 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.826244 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:00 +0000 UTC" deadline="2028-01-29 05:43:31.809107676 +0000 UTC" Apr 21 03:57:01.826399 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.826270 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15553h46m29.982840948s" Apr 21 03:57:01.875759 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.875735 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 03:57:01.887636 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvdg\" (UniqueName: \"kubernetes.io/projected/7ebaea07-99ce-462a-8ed5-d99c06d6417e-kube-api-access-bfvdg\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.887636 
ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48171444-16ad-44d3-adcd-dbc651bb6b7e-tmp-dir\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww" Apr 21 03:57:01.887862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flfdm\" (UniqueName: \"kubernetes.io/projected/39b234de-dba0-44dc-8b51-1e777c24a972-kube-api-access-flfdm\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" Apr 21 03:57:01.887862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-kubelet\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.887963 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-etc-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.887963 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-cnibin\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.887963 
ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-cni-bin\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.887963 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-etc-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.887963 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-kubelet\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887967 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-cnibin\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.887960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xd82\" (UniqueName: \"kubernetes.io/projected/db466d44-e778-45ee-935c-04ea6c071763-kube-api-access-8xd82\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 
03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-cni-bin\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-ovn\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-ovn\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovnkube-script-lib\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48171444-16ad-44d3-adcd-dbc651bb6b7e-tmp-dir\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-multus-certs\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48171444-16ad-44d3-adcd-dbc651bb6b7e-hosts-file\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovnkube-config\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-multus-certs\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888194 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-env-overrides\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48171444-16ad-44d3-adcd-dbc651bb6b7e-hosts-file\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ebaea07-99ce-462a-8ed5-d99c06d6417e-cni-binary-copy\") pod \"multus-bdjrx\" (UID: 
\"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-kubelet\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-hostroot\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-kubelet\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysconfig\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-hostroot\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b9475f5-b0ae-474d-b51c-d5a1efe76899-host\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888605 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-env-overrides\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-sys\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-host\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqn4\" (UniqueName: \"kubernetes.io/projected/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-kube-api-access-4vqn4\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.888842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovnkube-script-lib\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888745 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-sys-fs\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-log-socket\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-os-release\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovnkube-config\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-sys-fs\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: 
I0421 03:57:01.888831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-log-socket\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b9475f5-b0ae-474d-b51c-d5a1efe76899-serviceca\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888910 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a68bd18b-f831-4da9-b0f4-303529c0c0ca-agent-certs\") pod \"konnectivity-agent-kqnd2\" (UID: \"a68bd18b-f831-4da9-b0f4-303529c0c0ca\") " pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-var-lib-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-cni-netd\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:57:01.888968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kknjt\" (UniqueName: \"kubernetes.io/projected/2d01920e-91d4-4f4c-b55b-21442ffc88c5-kube-api-access-kknjt\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ebaea07-99ce-462a-8ed5-d99c06d6417e-cni-binary-copy\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-var-lib-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.888986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovn-node-metrics-cert\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-cni-netd\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:01.889554 
ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-netns\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.889554 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889078 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-run\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a68bd18b-f831-4da9-b0f4-303529c0c0ca-konnectivity-ca\") pod \"konnectivity-agent-kqnd2\" (UID: \"a68bd18b-f831-4da9-b0f4-303529c0c0ca\") " pod="kube-system/konnectivity-agent-kqnd2"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-netns\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h555s\" (UniqueName: \"kubernetes.io/projected/48171444-16ad-44d3-adcd-dbc651bb6b7e-kube-api-access-h555s\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-cni-multus\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-openvswitch\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysctl-d\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-run-netns\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889245 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-var-lib-cni-multus\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889252 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-daemon-config\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-systemd\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-run-netns\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-device-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-system-cni-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.890250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-device-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889445 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97dfk\" (UniqueName: \"kubernetes.io/projected/4b9475f5-b0ae-474d-b51c-d5a1efe76899-kube-api-access-97dfk\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-etc-selinux\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-system-cni-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-systemd-units\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-etc-selinux\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-slash\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889544 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-systemd-units\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-os-release\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-slash\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889575 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-lib-modules\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-systemd\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-os-release\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f24fb\" (UniqueName: \"kubernetes.io/projected/29783e2f-7e33-4e0b-a972-c774556775ce-kube-api-access-f24fb\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889665 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-run-systemd\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-k8s-cni-cncf-io\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-conf-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889756 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-host-run-k8s-cni-cncf-io\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.890832 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-conf-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-etc-kubernetes\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-etc-kubernetes\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-cnibin\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-var-lib-kubelet\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-socket-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29783e2f-7e33-4e0b-a972-c774556775ce-host-slash\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889962 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysctl-conf\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.889985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-tuned\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-registration-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-socket-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29783e2f-7e33-4e0b-a972-c774556775ce-host-slash\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-node-log\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/39b234de-dba0-44dc-8b51-1e777c24a972-registration-dir\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-cni-bin\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-node-log\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.891381 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/29783e2f-7e33-4e0b-a972-c774556775ce-iptables-alerter-script\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-cni-bin\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d01920e-91d4-4f4c-b55b-21442ffc88c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-daemon-config\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-socket-dir-parent\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890182 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-socket-dir-parent\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-cni-binary-copy\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fdtz\" (UniqueName: \"kubernetes.io/projected/88564462-797f-416f-b56b-0e31e0156815-kube-api-access-6fdtz\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-cni-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890298 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-system-cni-dir\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ebaea07-99ce-462a-8ed5-d99c06d6417e-multus-cni-dir\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-modprobe-d\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-kubernetes\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-tmp\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.890455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b"
Apr 21 03:57:01.891862 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.891148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/29783e2f-7e33-4e0b-a972-c774556775ce-iptables-alerter-script\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.892266 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.892133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d01920e-91d4-4f4c-b55b-21442ffc88c5-ovn-node-metrics-cert\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.896588 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.896507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flfdm\" (UniqueName: \"kubernetes.io/projected/39b234de-dba0-44dc-8b51-1e777c24a972-kube-api-access-flfdm\") pod \"aws-ebs-csi-driver-node-s9pdk\" (UID: \"39b234de-dba0-44dc-8b51-1e777c24a972\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk"
Apr 21 03:57:01.896731 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.896639 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f24fb\" (UniqueName: \"kubernetes.io/projected/29783e2f-7e33-4e0b-a972-c774556775ce-kube-api-access-f24fb\") pod \"iptables-alerter-64s86\" (UID: \"29783e2f-7e33-4e0b-a972-c774556775ce\") " pod="openshift-network-operator/iptables-alerter-64s86"
Apr 21 03:57:01.896731 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.896691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvdg\" (UniqueName: \"kubernetes.io/projected/7ebaea07-99ce-462a-8ed5-d99c06d6417e-kube-api-access-bfvdg\") pod \"multus-bdjrx\" (UID: \"7ebaea07-99ce-462a-8ed5-d99c06d6417e\") " pod="openshift-multus/multus-bdjrx"
Apr 21 03:57:01.896731 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.896688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h555s\" (UniqueName: \"kubernetes.io/projected/48171444-16ad-44d3-adcd-dbc651bb6b7e-kube-api-access-h555s\") pod \"node-resolver-m7pww\" (UID: \"48171444-16ad-44d3-adcd-dbc651bb6b7e\") " pod="openshift-dns/node-resolver-m7pww"
Apr 21 03:57:01.896920 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.896899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknjt\" (UniqueName: \"kubernetes.io/projected/2d01920e-91d4-4f4c-b55b-21442ffc88c5-kube-api-access-kknjt\") pod \"ovnkube-node-9ssvt\" (UID: \"2d01920e-91d4-4f4c-b55b-21442ffc88c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt"
Apr 21 03:57:01.991149 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b9475f5-b0ae-474d-b51c-d5a1efe76899-host\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c"
Apr 21 03:57:01.991149 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-sys\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991180 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-host\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqn4\" (UniqueName: \"kubernetes.io/projected/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-kube-api-access-4vqn4\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-os-release\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991263 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-sys\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b9475f5-b0ae-474d-b51c-d5a1efe76899-host\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-host\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b9475f5-b0ae-474d-b51c-d5a1efe76899-serviceca\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a68bd18b-f831-4da9-b0f4-303529c0c0ca-agent-certs\") pod \"konnectivity-agent-kqnd2\" (UID: \"a68bd18b-f831-4da9-b0f4-303529c0c0ca\") " pod="kube-system/konnectivity-agent-kqnd2"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-os-release\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.991409 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-run\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a68bd18b-f831-4da9-b0f4-303529c0c0ca-konnectivity-ca\") pod \"konnectivity-agent-kqnd2\" (UID: \"a68bd18b-f831-4da9-b0f4-303529c0c0ca\") " pod="kube-system/konnectivity-agent-kqnd2"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysctl-d\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-systemd\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-run\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97dfk\" (UniqueName: \"kubernetes.io/projected/4b9475f5-b0ae-474d-b51c-d5a1efe76899-kube-api-access-97dfk\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-lib-modules\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysctl-d\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-lib-modules\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991687 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-systemd\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w"
Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-cnibin\") pod \"multus-additional-cni-plugins-8v8nb\" (UID:
\"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-var-lib-kubelet\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysctl-conf\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-cnibin\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991791 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b9475f5-b0ae-474d-b51c-d5a1efe76899-serviceca\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c" Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-tuned\") pod \"tuned-ng54w\" (UID: 
\"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.991894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-var-lib-kubelet\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-cni-binary-copy\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fdtz\" (UniqueName: \"kubernetes.io/projected/88564462-797f-416f-b56b-0e31e0156815-kube-api-access-6fdtz\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-system-cni-dir\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysctl-conf\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-modprobe-d\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-kubernetes\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db466d44-e778-45ee-935c-04ea6c071763-system-cni-dir\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-tmp\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xd82\" (UniqueName: \"kubernetes.io/projected/db466d44-e778-45ee-935c-04ea6c071763-kube-api-access-8xd82\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-modprobe-d\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 
03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a68bd18b-f831-4da9-b0f4-303529c0c0ca-konnectivity-ca\") pod \"konnectivity-agent-kqnd2\" (UID: \"a68bd18b-f831-4da9-b0f4-303529c0c0ca\") " pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysconfig\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.991956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.992716 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:01.992322 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:01.993608 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-cni-binary-copy\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.993608 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:01.992429 2578 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:02.492374904 +0000 UTC m=+3.059345379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:01.993608 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-kubernetes\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.993608 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-sysconfig\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.993608 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.992721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/db466d44-e778-45ee-935c-04ea6c071763-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:01.994505 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.994462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/a68bd18b-f831-4da9-b0f4-303529c0c0ca-agent-certs\") pod \"konnectivity-agent-kqnd2\" (UID: \"a68bd18b-f831-4da9-b0f4-303529c0c0ca\") " pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:01.994505 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.994471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-tmp\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:01.994676 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:01.994509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-etc-tuned\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:02.000492 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.000466 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:02.000492 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.000492 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:02.000668 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.000508 2578 projected.go:194] Error preparing data for projected volume kube-api-access-hxxz5 for pod openshift-network-diagnostics/network-check-target-sql8b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:02.000668 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.000597 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5 podName:e498c7e8-3ee2-49ce-8ccf-9a86e869f003 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:02.500578652 +0000 UTC m=+3.067549141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hxxz5" (UniqueName: "kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5") pod "network-check-target-sql8b" (UID: "e498c7e8-3ee2-49ce-8ccf-9a86e869f003") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:02.002790 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.002760 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqn4\" (UniqueName: \"kubernetes.io/projected/9a8b2ae1-8e68-4c2d-a316-6fc7547a3812-kube-api-access-4vqn4\") pod \"tuned-ng54w\" (UID: \"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812\") " pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:02.002908 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.002883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xd82\" (UniqueName: \"kubernetes.io/projected/db466d44-e778-45ee-935c-04ea6c071763-kube-api-access-8xd82\") pod \"multus-additional-cni-plugins-8v8nb\" (UID: \"db466d44-e778-45ee-935c-04ea6c071763\") " pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:02.004226 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.004202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fdtz\" (UniqueName: \"kubernetes.io/projected/88564462-797f-416f-b56b-0e31e0156815-kube-api-access-6fdtz\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:02.004424 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:57:02.004404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97dfk\" (UniqueName: \"kubernetes.io/projected/4b9475f5-b0ae-474d-b51c-d5a1efe76899-kube-api-access-97dfk\") pod \"node-ca-9j76c\" (UID: \"4b9475f5-b0ae-474d-b51c-d5a1efe76899\") " pod="openshift-image-registry/node-ca-9j76c" Apr 21 03:57:02.078605 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.078535 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-64s86" Apr 21 03:57:02.087469 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.087441 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bdjrx" Apr 21 03:57:02.096460 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.096435 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m7pww" Apr 21 03:57:02.102037 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.102017 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" Apr 21 03:57:02.108691 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.108671 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:02.114270 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.114247 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" Apr 21 03:57:02.120997 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.120977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9j76c" Apr 21 03:57:02.126676 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.126654 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ng54w" Apr 21 03:57:02.134972 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.132661 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:02.461765 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:02.461737 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29783e2f_7e33_4e0b_a972_c774556775ce.slice/crio-e23744a814c8589e9eb192f48f39194b7b7197ba529f8ff3522d3b0488c71962 WatchSource:0}: Error finding container e23744a814c8589e9eb192f48f39194b7b7197ba529f8ff3522d3b0488c71962: Status 404 returned error can't find the container with id e23744a814c8589e9eb192f48f39194b7b7197ba529f8ff3522d3b0488c71962 Apr 21 03:57:02.466688 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:02.466642 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ebaea07_99ce_462a_8ed5_d99c06d6417e.slice/crio-337e30dd4a1a492773912a10b418b7c78b6594d85f44b1b09ace632d28190fb8 WatchSource:0}: Error finding container 337e30dd4a1a492773912a10b418b7c78b6594d85f44b1b09ace632d28190fb8: Status 404 returned error can't find the container with id 337e30dd4a1a492773912a10b418b7c78b6594d85f44b1b09ace632d28190fb8 Apr 21 03:57:02.468539 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:02.468516 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8b2ae1_8e68_4c2d_a316_6fc7547a3812.slice/crio-6405cf01e42d8b304874580d695b51234ea053bdaa8e71537ca4b6c0f2119e5d WatchSource:0}: Error finding container 6405cf01e42d8b304874580d695b51234ea053bdaa8e71537ca4b6c0f2119e5d: Status 404 returned error can't find the container with id 6405cf01e42d8b304874580d695b51234ea053bdaa8e71537ca4b6c0f2119e5d Apr 21 03:57:02.469459 ip-10-0-131-182 
kubenswrapper[2578]: W0421 03:57:02.469437 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb466d44_e778_45ee_935c_04ea6c071763.slice/crio-77a5eaeec883669277e95655810bdd6d43922fe59f77a126f9d2a28673b59f3c WatchSource:0}: Error finding container 77a5eaeec883669277e95655810bdd6d43922fe59f77a126f9d2a28673b59f3c: Status 404 returned error can't find the container with id 77a5eaeec883669277e95655810bdd6d43922fe59f77a126f9d2a28673b59f3c Apr 21 03:57:02.470778 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:02.470641 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48171444_16ad_44d3_adcd_dbc651bb6b7e.slice/crio-83f625af25f61c20208c9e7a850f2d27bec7fb59394b7b89f47b3f5b15891d42 WatchSource:0}: Error finding container 83f625af25f61c20208c9e7a850f2d27bec7fb59394b7b89f47b3f5b15891d42: Status 404 returned error can't find the container with id 83f625af25f61c20208c9e7a850f2d27bec7fb59394b7b89f47b3f5b15891d42 Apr 21 03:57:02.471487 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:02.471395 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b9475f5_b0ae_474d_b51c_d5a1efe76899.slice/crio-2ebf530f31f0fcd5bb5381396bc8fab0b7591f18933865573b1830b40c8608e0 WatchSource:0}: Error finding container 2ebf530f31f0fcd5bb5381396bc8fab0b7591f18933865573b1830b40c8608e0: Status 404 returned error can't find the container with id 2ebf530f31f0fcd5bb5381396bc8fab0b7591f18933865573b1830b40c8608e0 Apr 21 03:57:02.472380 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:02.472360 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d01920e_91d4_4f4c_b55b_21442ffc88c5.slice/crio-b3f0de271ae607ea5791bf5adadc836ffd16ffc35cff83ab26574a1f420b7f95 WatchSource:0}: Error finding 
container b3f0de271ae607ea5791bf5adadc836ffd16ffc35cff83ab26574a1f420b7f95: Status 404 returned error can't find the container with id b3f0de271ae607ea5791bf5adadc836ffd16ffc35cff83ab26574a1f420b7f95 Apr 21 03:57:02.473496 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:02.473221 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b234de_dba0_44dc_8b51_1e777c24a972.slice/crio-ae72fbd0c97c1768e2b239b3c7b5ac38cf4683286440175ee2308ff2631f09e0 WatchSource:0}: Error finding container ae72fbd0c97c1768e2b239b3c7b5ac38cf4683286440175ee2308ff2631f09e0: Status 404 returned error can't find the container with id ae72fbd0c97c1768e2b239b3c7b5ac38cf4683286440175ee2308ff2631f09e0 Apr 21 03:57:02.495145 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.495120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:02.495272 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.495255 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:02.495343 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.495331 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:03.495296648 +0000 UTC m=+4.062267131 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:02.595970 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.595918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:02.596169 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.596049 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:02.596169 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.596066 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:02.596169 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.596079 2578 projected.go:194] Error preparing data for projected volume kube-api-access-hxxz5 for pod openshift-network-diagnostics/network-check-target-sql8b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:02.596169 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:02.596137 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5 podName:e498c7e8-3ee2-49ce-8ccf-9a86e869f003 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:57:03.596122963 +0000 UTC m=+4.163093438 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxxz5" (UniqueName: "kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5") pod "network-check-target-sql8b" (UID: "e498c7e8-3ee2-49ce-8ccf-9a86e869f003") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:02.826737 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.826603 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:00 +0000 UTC" deadline="2028-01-07 04:23:29.86806846 +0000 UTC" Apr 21 03:57:02.826737 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.826642 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15024h26m27.041430386s" Apr 21 03:57:02.882420 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.882118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerStarted","Data":"77a5eaeec883669277e95655810bdd6d43922fe59f77a126f9d2a28673b59f3c"} Apr 21 03:57:02.888214 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.888145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ng54w" event={"ID":"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812","Type":"ContainerStarted","Data":"6405cf01e42d8b304874580d695b51234ea053bdaa8e71537ca4b6c0f2119e5d"} Apr 21 03:57:02.893010 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.892979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdjrx" 
event={"ID":"7ebaea07-99ce-462a-8ed5-d99c06d6417e","Type":"ContainerStarted","Data":"337e30dd4a1a492773912a10b418b7c78b6594d85f44b1b09ace632d28190fb8"} Apr 21 03:57:02.895798 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.895692 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-64s86" event={"ID":"29783e2f-7e33-4e0b-a972-c774556775ce","Type":"ContainerStarted","Data":"e23744a814c8589e9eb192f48f39194b7b7197ba529f8ff3522d3b0488c71962"} Apr 21 03:57:02.900108 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.899207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal" event={"ID":"153f8e46061c7f7dc78983197c922203","Type":"ContainerStarted","Data":"3a17bd1c5388a018dbeb3a5671184a1840291f7287bfeb0a7775cb4bf641a2f9"} Apr 21 03:57:02.908071 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.908010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" event={"ID":"39b234de-dba0-44dc-8b51-1e777c24a972","Type":"ContainerStarted","Data":"ae72fbd0c97c1768e2b239b3c7b5ac38cf4683286440175ee2308ff2631f09e0"} Apr 21 03:57:02.912066 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.912039 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m7pww" event={"ID":"48171444-16ad-44d3-adcd-dbc651bb6b7e","Type":"ContainerStarted","Data":"83f625af25f61c20208c9e7a850f2d27bec7fb59394b7b89f47b3f5b15891d42"} Apr 21 03:57:02.916900 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.916874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9j76c" event={"ID":"4b9475f5-b0ae-474d-b51c-d5a1efe76899","Type":"ContainerStarted","Data":"2ebf530f31f0fcd5bb5381396bc8fab0b7591f18933865573b1830b40c8608e0"} Apr 21 03:57:02.917467 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.917407 2578 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-182.ec2.internal" podStartSLOduration=1.917384879 podStartE2EDuration="1.917384879s" podCreationTimestamp="2026-04-21 03:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:02.916511387 +0000 UTC m=+3.483481880" watchObservedRunningTime="2026-04-21 03:57:02.917384879 +0000 UTC m=+3.484355371" Apr 21 03:57:02.922515 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.922489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kqnd2" event={"ID":"a68bd18b-f831-4da9-b0f4-303529c0c0ca","Type":"ContainerStarted","Data":"9ee19b3afa02b702364220f8f127e99c1c43f7b57f41071e9803328c6f965358"} Apr 21 03:57:02.929479 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:02.929452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"b3f0de271ae607ea5791bf5adadc836ffd16ffc35cff83ab26574a1f420b7f95"} Apr 21 03:57:03.501898 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:03.501780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:03.502056 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.501954 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:03.502056 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.502040 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:05.50201999 +0000 UTC m=+6.068990475 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:03.602440 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:03.602402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:03.602629 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.602563 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:03.602629 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.602582 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:03.602629 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.602595 2578 projected.go:194] Error preparing data for projected volume kube-api-access-hxxz5 for pod openshift-network-diagnostics/network-check-target-sql8b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:03.602888 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.602654 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5 podName:e498c7e8-3ee2-49ce-8ccf-9a86e869f003 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:05.602634927 +0000 UTC m=+6.169605414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxxz5" (UniqueName: "kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5") pod "network-check-target-sql8b" (UID: "e498c7e8-3ee2-49ce-8ccf-9a86e869f003") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:03.790533 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:03.790447 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:03.885670 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:03.884494 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:03.885670 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:03.884567 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:03.885670 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.884690 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:03.885670 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:03.885208 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:03.942259 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:03.942216 2578 generic.go:358] "Generic (PLEG): container finished" podID="073b4f13c42b54bbac489c0b10c87cd3" containerID="350d039a017cd60eb7264501f7c3c0f5ecb359691d97733b1c12e494b8068673" exitCode=0 Apr 21 03:57:03.943437 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:03.943183 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" event={"ID":"073b4f13c42b54bbac489c0b10c87cd3","Type":"ContainerDied","Data":"350d039a017cd60eb7264501f7c3c0f5ecb359691d97733b1c12e494b8068673"} Apr 21 03:57:04.949332 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:04.949260 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" event={"ID":"073b4f13c42b54bbac489c0b10c87cd3","Type":"ContainerStarted","Data":"367e749d0c6f295a3485a04bdd78fe1f0185ec4fce2ce5e67cede04c13100722"} Apr 21 03:57:05.519669 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:05.519635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " 
pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:05.519857 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.519807 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:05.519924 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.519890 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:09.519869664 +0000 UTC m=+10.086840148 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:05.620662 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:05.620020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:05.620662 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.620197 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:05.620662 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.620217 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:05.620662 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.620230 
2578 projected.go:194] Error preparing data for projected volume kube-api-access-hxxz5 for pod openshift-network-diagnostics/network-check-target-sql8b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:05.620662 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.620297 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5 podName:e498c7e8-3ee2-49ce-8ccf-9a86e869f003 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:09.620277664 +0000 UTC m=+10.187248154 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxxz5" (UniqueName: "kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5") pod "network-check-target-sql8b" (UID: "e498c7e8-3ee2-49ce-8ccf-9a86e869f003") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:05.872662 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:05.872570 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:05.872662 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:05.872613 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:05.872873 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.872711 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:05.872873 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:05.872802 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:07.872961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:07.872930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:07.873390 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:07.873073 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:07.873729 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:07.873589 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:07.873729 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:07.873687 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:09.555456 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:09.555401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:09.555931 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.555548 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:09.555931 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.555617 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:17.555596902 +0000 UTC m=+18.122567386 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:09.656043 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:09.655984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:09.656214 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.656149 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:09.656214 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.656173 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:09.656214 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.656183 2578 projected.go:194] Error preparing data for projected volume kube-api-access-hxxz5 for pod openshift-network-diagnostics/network-check-target-sql8b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:09.656214 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.656239 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5 podName:e498c7e8-3ee2-49ce-8ccf-9a86e869f003 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:57:17.656223157 +0000 UTC m=+18.223193627 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxxz5" (UniqueName: "kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5") pod "network-check-target-sql8b" (UID: "e498c7e8-3ee2-49ce-8ccf-9a86e869f003") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:09.873842 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:09.873758 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:09.873991 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:09.873758 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:09.873991 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.873889 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:09.874100 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:09.874031 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:11.872223 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:11.872189 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:11.872703 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:11.872189 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:11.872703 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:11.872342 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:11.872703 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:11.872492 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:13.872752 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:13.872715 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:13.872752 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:13.872752 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:13.873259 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:13.872852 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:13.873259 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:13.872952 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:15.876023 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:15.875801 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:15.876478 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:15.875830 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:15.876478 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:15.876113 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:15.876478 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:15.876194 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:17.616183 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:17.616145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:17.616576 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.616254 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:17.616576 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.616323 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:33.61629304 +0000 UTC m=+34.183263515 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:17.717019 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:17.716971 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:17.717185 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.717148 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:17.717185 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.717171 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:17.717185 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.717181 2578 projected.go:194] Error preparing data for projected volume kube-api-access-hxxz5 for pod openshift-network-diagnostics/network-check-target-sql8b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:17.717355 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.717235 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5 podName:e498c7e8-3ee2-49ce-8ccf-9a86e869f003 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:57:33.717221449 +0000 UTC m=+34.284191923 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxxz5" (UniqueName: "kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5") pod "network-check-target-sql8b" (UID: "e498c7e8-3ee2-49ce-8ccf-9a86e869f003") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:17.872894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:17.872812 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:17.873025 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.872932 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:17.873025 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:17.872994 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:17.873127 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:17.873103 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:19.875301 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.875074 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:19.876105 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.875074 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:19.876105 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:19.875364 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:19.876105 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:19.875420 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:19.975757 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.975720 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m7pww" event={"ID":"48171444-16ad-44d3-adcd-dbc651bb6b7e","Type":"ContainerStarted","Data":"b5c8b602f3afeab6c04883d94c8d13e8cf5f6804b26be5cc0079d8dcb28aca59"} Apr 21 03:57:19.977188 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.977156 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9j76c" event={"ID":"4b9475f5-b0ae-474d-b51c-d5a1efe76899","Type":"ContainerStarted","Data":"752982c9609fd283858941f1a7d1d1d23584bf96d8c23059106eebd31192e880"} Apr 21 03:57:19.978504 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.978465 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kqnd2" event={"ID":"a68bd18b-f831-4da9-b0f4-303529c0c0ca","Type":"ContainerStarted","Data":"a4b0b439f115b2449af8c617539d16e41d3f25de9d3c18f01c34fc6a070e3f5e"} Apr 21 03:57:19.980391 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.980373 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 03:57:19.980730 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.980705 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d01920e-91d4-4f4c-b55b-21442ffc88c5" containerID="ab7e2a7c5fe869ee9357d5a44065ed6f3eefbd580c65981818fff35b59d226ad" exitCode=1 Apr 21 03:57:19.980836 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.980736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"44fab5f5829f77619a1112c6df88398f2489b877501246bc88c0dff83177ab90"} Apr 21 03:57:19.980836 
ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.980774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"230b52e0be0e73648cad0f0ba7c5a46e9035b4b6790c0080c6d1402d65c6d07e"} Apr 21 03:57:19.980836 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.980788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerDied","Data":"ab7e2a7c5fe869ee9357d5a44065ed6f3eefbd580c65981818fff35b59d226ad"} Apr 21 03:57:19.980836 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.980804 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"c121221f4c475990edf66bc05553d8a0914a978af0d8b44a379518d7b49e4847"} Apr 21 03:57:19.982243 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.982221 2578 generic.go:358] "Generic (PLEG): container finished" podID="db466d44-e778-45ee-935c-04ea6c071763" containerID="2f8a22a990d1ae645e2a509174b3cdb2073a4d5d69ba15870f2bf5e39ca84e01" exitCode=0 Apr 21 03:57:19.982347 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.982291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerDied","Data":"2f8a22a990d1ae645e2a509174b3cdb2073a4d5d69ba15870f2bf5e39ca84e01"} Apr 21 03:57:19.983728 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.983708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ng54w" event={"ID":"9a8b2ae1-8e68-4c2d-a316-6fc7547a3812","Type":"ContainerStarted","Data":"5c4a3d614f4119a5a079c212575e896d06a637a9cfd1d07e06687c999f38725b"} Apr 21 03:57:19.985137 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:57:19.985103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdjrx" event={"ID":"7ebaea07-99ce-462a-8ed5-d99c06d6417e","Type":"ContainerStarted","Data":"7c7a3b0b17bd3d0b7e546ae70bd619be78fa360fd4f05db22c1a4d8ffcbe081d"} Apr 21 03:57:19.986416 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.986398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" event={"ID":"39b234de-dba0-44dc-8b51-1e777c24a972","Type":"ContainerStarted","Data":"98a970cb4c44a04db8fa13ed85246caa2cc129cf4c3159be6060b678cba087b4"} Apr 21 03:57:19.993004 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.992968 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m7pww" podStartSLOduration=4.334687108 podStartE2EDuration="20.992958605s" podCreationTimestamp="2026-04-21 03:56:59 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.472246958 +0000 UTC m=+3.039217441" lastFinishedPulling="2026-04-21 03:57:19.130518467 +0000 UTC m=+19.697488938" observedRunningTime="2026-04-21 03:57:19.99271682 +0000 UTC m=+20.559687312" watchObservedRunningTime="2026-04-21 03:57:19.992958605 +0000 UTC m=+20.559929096" Apr 21 03:57:19.993102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:19.993046 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-182.ec2.internal" podStartSLOduration=18.993041797 podStartE2EDuration="18.993041797s" podCreationTimestamp="2026-04-21 03:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:04.965010434 +0000 UTC m=+5.531980928" watchObservedRunningTime="2026-04-21 03:57:19.993041797 +0000 UTC m=+20.560012306" Apr 21 03:57:20.028733 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.028678 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9j76c" podStartSLOduration=3.377403583 podStartE2EDuration="20.028664438s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.47359441 +0000 UTC m=+3.040564880" lastFinishedPulling="2026-04-21 03:57:19.124855257 +0000 UTC m=+19.691825735" observedRunningTime="2026-04-21 03:57:20.028202624 +0000 UTC m=+20.595173129" watchObservedRunningTime="2026-04-21 03:57:20.028664438 +0000 UTC m=+20.595634930" Apr 21 03:57:20.049550 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.049370 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ng54w" podStartSLOduration=3.392535961 podStartE2EDuration="20.049357106s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.4703678 +0000 UTC m=+3.037338270" lastFinishedPulling="2026-04-21 03:57:19.127188931 +0000 UTC m=+19.694159415" observedRunningTime="2026-04-21 03:57:20.049124658 +0000 UTC m=+20.616095149" watchObservedRunningTime="2026-04-21 03:57:20.049357106 +0000 UTC m=+20.616327598" Apr 21 03:57:20.064462 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.064419 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kqnd2" podStartSLOduration=7.79427436 podStartE2EDuration="20.064405413s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.476049896 +0000 UTC m=+3.043020381" lastFinishedPulling="2026-04-21 03:57:14.74618096 +0000 UTC m=+15.313151434" observedRunningTime="2026-04-21 03:57:20.063511821 +0000 UTC m=+20.630482312" watchObservedRunningTime="2026-04-21 03:57:20.064405413 +0000 UTC m=+20.631375906" Apr 21 03:57:20.083272 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.083220 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-bdjrx" podStartSLOduration=4.408228109 podStartE2EDuration="21.08320668s" podCreationTimestamp="2026-04-21 03:56:59 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.468300121 +0000 UTC m=+3.035270591" lastFinishedPulling="2026-04-21 03:57:19.143278691 +0000 UTC m=+19.710249162" observedRunningTime="2026-04-21 03:57:20.08262572 +0000 UTC m=+20.649596214" watchObservedRunningTime="2026-04-21 03:57:20.08320668 +0000 UTC m=+20.650177174" Apr 21 03:57:20.681555 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.681524 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 03:57:20.851204 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.851113 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T03:57:20.681547991Z","UUID":"ec27d48d-d0b7-4f03-bce6-8d9a8ae6339b","Handler":null,"Name":"","Endpoint":""} Apr 21 03:57:20.852837 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.852808 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 03:57:20.852837 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.852843 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 03:57:20.990408 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.990356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-64s86" event={"ID":"29783e2f-7e33-4e0b-a972-c774556775ce","Type":"ContainerStarted","Data":"51ffbb52370cad45f5a4bb3591b55e1797533daad02123819077be30fc448c8b"} Apr 21 03:57:20.992155 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.992128 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" event={"ID":"39b234de-dba0-44dc-8b51-1e777c24a972","Type":"ContainerStarted","Data":"38e37550e428c5ea643ec33039ed72fac6ea4802b909d4c50abbc941e0e62763"} Apr 21 03:57:20.994848 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.994823 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 03:57:20.995273 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.995238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"e64fea77e71484613a5f5210cb489d5cb7dfaddfc7373446ad1825dcbb2ddeaf"} Apr 21 03:57:20.995396 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:20.995282 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"a54bee2c935ee502894bcf32cfcab13730e5a18b25baf43a0940debec7cb8ee3"} Apr 21 03:57:21.004698 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:21.004651 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-64s86" podStartSLOduration=5.3439950849999995 podStartE2EDuration="22.004636248s" podCreationTimestamp="2026-04-21 03:56:59 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.464237029 +0000 UTC m=+3.031207514" lastFinishedPulling="2026-04-21 03:57:19.124878207 +0000 UTC m=+19.691848677" observedRunningTime="2026-04-21 03:57:21.004191392 +0000 UTC m=+21.571161885" watchObservedRunningTime="2026-04-21 03:57:21.004636248 +0000 UTC m=+21.571606743" Apr 21 03:57:21.873520 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:21.873478 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:21.873726 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:21.873651 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:21.874229 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:21.874076 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:21.874229 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:21.874186 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:21.999536 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:21.999498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" event={"ID":"39b234de-dba0-44dc-8b51-1e777c24a972","Type":"ContainerStarted","Data":"a75f297ed3e1a34d42404cce468ba32a0c1a648b9dd7fb6483dab528dd0cc71f"} Apr 21 03:57:22.018682 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:22.018632 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s9pdk" podStartSLOduration=3.674273196 podStartE2EDuration="23.018617959s" podCreationTimestamp="2026-04-21 03:56:59 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.475440724 +0000 UTC m=+3.042411209" lastFinishedPulling="2026-04-21 03:57:21.819785493 +0000 UTC m=+22.386755972" observedRunningTime="2026-04-21 03:57:22.018056895 +0000 UTC m=+22.585027424" watchObservedRunningTime="2026-04-21 03:57:22.018617959 +0000 UTC m=+22.585588450" Apr 21 03:57:22.449927 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:22.449898 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:22.450426 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:22.450409 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:22.454159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:22.454134 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:22.454552 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:22.454536 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kqnd2" Apr 21 03:57:23.004424 ip-10-0-131-182 kubenswrapper[2578]: I0421 
03:57:23.004395 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 03:57:23.004977 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:23.004763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"6c5217a668adf96a1b2a73192465b787f43bb2818dd9a06aa3ad7c32d0306912"} Apr 21 03:57:23.872685 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:23.872650 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:23.872875 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:23.872698 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:23.872875 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:23.872790 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:23.873039 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:23.872929 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:23.917228 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:23.917195 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lqtd7"] Apr 21 03:57:23.942766 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:23.942721 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:23.942925 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:23.942817 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lqtd7" podUID="8ba63635-96d8-482b-a6ee-d309369ecee1" Apr 21 03:57:24.066052 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.066014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8ba63635-96d8-482b-a6ee-d309369ecee1-kubelet-config\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.066479 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.066084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.066479 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.066159 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8ba63635-96d8-482b-a6ee-d309369ecee1-dbus\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.167161 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.167133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8ba63635-96d8-482b-a6ee-d309369ecee1-dbus\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.167290 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.167208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8ba63635-96d8-482b-a6ee-d309369ecee1-kubelet-config\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.167290 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.167235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.167383 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.167343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8ba63635-96d8-482b-a6ee-d309369ecee1-kubelet-config\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.167383 ip-10-0-131-182 
kubenswrapper[2578]: E0421 03:57:24.167356 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:24.167444 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.167407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8ba63635-96d8-482b-a6ee-d309369ecee1-dbus\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.167444 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:24.167414 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret podName:8ba63635-96d8-482b-a6ee-d309369ecee1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:24.667398662 +0000 UTC m=+25.234369148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret") pod "global-pull-secret-syncer-lqtd7" (UID: "8ba63635-96d8-482b-a6ee-d309369ecee1") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:24.671135 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:24.670776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:24.671135 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:24.670913 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:24.671458 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:24.671155 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret podName:8ba63635-96d8-482b-a6ee-d309369ecee1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:25.671134196 +0000 UTC m=+26.238104681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret") pod "global-pull-secret-syncer-lqtd7" (UID: "8ba63635-96d8-482b-a6ee-d309369ecee1") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:25.011934 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.011909 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 03:57:25.012265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.012243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"a7379e964668501106e6fcdffacee99017c98453d0ea446dd4da79da5d19cdd2"} Apr 21 03:57:25.012553 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.012531 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:25.012628 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.012562 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:25.012717 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.012700 2578 scope.go:117] "RemoveContainer" containerID="ab7e2a7c5fe869ee9357d5a44065ed6f3eefbd580c65981818fff35b59d226ad" Apr 21 03:57:25.013998 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.013977 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="db466d44-e778-45ee-935c-04ea6c071763" containerID="5d42a1617e90a51f366750119748ff7926191c02d42f3e631622f89c5d0e1353" exitCode=0 Apr 21 03:57:25.014101 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.014021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerDied","Data":"5d42a1617e90a51f366750119748ff7926191c02d42f3e631622f89c5d0e1353"} Apr 21 03:57:25.028797 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.028776 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:25.676962 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.676808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:25.677364 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:25.676946 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:25.677364 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:25.677051 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret podName:8ba63635-96d8-482b-a6ee-d309369ecee1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:27.677037462 +0000 UTC m=+28.244007932 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret") pod "global-pull-secret-syncer-lqtd7" (UID: "8ba63635-96d8-482b-a6ee-d309369ecee1") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:25.874850 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.874761 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:25.874850 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.874824 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:25.875045 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:25.874917 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:25.875282 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:25.875260 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:25.875397 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:25.875377 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:25.875467 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:25.875450 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lqtd7" podUID="8ba63635-96d8-482b-a6ee-d309369ecee1" Apr 21 03:57:26.019212 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.019188 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 03:57:26.019572 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.019544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" event={"ID":"2d01920e-91d4-4f4c-b55b-21442ffc88c5","Type":"ContainerStarted","Data":"830c88a24d32431604401ead812c63ab400ef7b7c89cca56731c67970a4d1d00"} Apr 21 03:57:26.019938 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.019919 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:26.021705 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.021675 2578 generic.go:358] "Generic (PLEG): container finished" podID="db466d44-e778-45ee-935c-04ea6c071763" containerID="ced8597880f898fb1bb0dd0674fe7e4600d4ceb8883c025dd089c0428472894d" exitCode=0 Apr 21 03:57:26.021788 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.021718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerDied","Data":"ced8597880f898fb1bb0dd0674fe7e4600d4ceb8883c025dd089c0428472894d"} Apr 
21 03:57:26.035048 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.035016 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:26.049029 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.048984 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" podStartSLOduration=9.109375577 podStartE2EDuration="26.048967946s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.475054498 +0000 UTC m=+3.042024971" lastFinishedPulling="2026-04-21 03:57:19.414646864 +0000 UTC m=+19.981617340" observedRunningTime="2026-04-21 03:57:26.048938996 +0000 UTC m=+26.615909500" watchObservedRunningTime="2026-04-21 03:57:26.048967946 +0000 UTC m=+26.615938420" Apr 21 03:57:26.522022 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.521989 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wcnkn"] Apr 21 03:57:26.522180 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.522116 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:57:26.522261 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:26.522235 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815" Apr 21 03:57:26.524894 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.524794 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sql8b"] Apr 21 03:57:26.525023 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.524981 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:57:26.525104 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:26.525082 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003" Apr 21 03:57:26.525636 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.525580 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lqtd7"] Apr 21 03:57:26.525732 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:26.525672 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7" Apr 21 03:57:26.525802 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:26.525747 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lqtd7" podUID="8ba63635-96d8-482b-a6ee-d309369ecee1"
Apr 21 03:57:27.025552 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:27.025517 2578 generic.go:358] "Generic (PLEG): container finished" podID="db466d44-e778-45ee-935c-04ea6c071763" containerID="d18935cb2463b7022070a7654ca366d3b7f7f6fd98406d260e9109a13a9cb9ff" exitCode=0
Apr 21 03:57:27.026281 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:27.025561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerDied","Data":"d18935cb2463b7022070a7654ca366d3b7f7f6fd98406d260e9109a13a9cb9ff"}
Apr 21 03:57:27.693579 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:27.693542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:27.693748 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:27.693662 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:27.693748 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:27.693714 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret podName:8ba63635-96d8-482b-a6ee-d309369ecee1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:31.693701026 +0000 UTC m=+32.260671496 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret") pod "global-pull-secret-syncer-lqtd7" (UID: "8ba63635-96d8-482b-a6ee-d309369ecee1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:27.872844 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:27.872806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:27.872844 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:27.872832 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b"
Apr 21 03:57:27.873071 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:27.872925 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn"
Apr 21 03:57:27.873071 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:27.872955 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lqtd7" podUID="8ba63635-96d8-482b-a6ee-d309369ecee1"
Apr 21 03:57:27.873071 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:27.873064 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815"
Apr 21 03:57:27.873225 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:27.873129 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003"
Apr 21 03:57:29.873105 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:29.873072 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn"
Apr 21 03:57:29.873712 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:29.873162 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b"
Apr 21 03:57:29.873712 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:29.873196 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:29.873712 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:29.873279 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815"
Apr 21 03:57:29.873712 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:29.873328 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003"
Apr 21 03:57:29.873712 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:29.873337 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lqtd7" podUID="8ba63635-96d8-482b-a6ee-d309369ecee1"
Apr 21 03:57:31.723795 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:31.723588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:31.724228 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:31.723737 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:31.724228 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:31.723882 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret podName:8ba63635-96d8-482b-a6ee-d309369ecee1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:39.723864095 +0000 UTC m=+40.290834565 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret") pod "global-pull-secret-syncer-lqtd7" (UID: "8ba63635-96d8-482b-a6ee-d309369ecee1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:31.873106 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:31.873072 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn"
Apr 21 03:57:31.873357 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:31.873126 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:31.873357 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:31.873205 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lqtd7" podUID="8ba63635-96d8-482b-a6ee-d309369ecee1"
Apr 21 03:57:31.873357 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:31.873322 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b"
Apr 21 03:57:31.873555 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:31.873447 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wcnkn" podUID="88564462-797f-416f-b56b-0e31e0156815"
Apr 21 03:57:31.873555 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:31.873519 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sql8b" podUID="e498c7e8-3ee2-49ce-8ccf-9a86e869f003"
Apr 21 03:57:32.259055 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.259024 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-182.ec2.internal" event="NodeReady"
Apr 21 03:57:32.259337 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.259169 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 03:57:32.301044 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.301006 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7mb9p"]
Apr 21 03:57:32.336205 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.336173 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zbrz9"]
Apr 21 03:57:32.336396 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.336360 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.338423 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.338399 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 03:57:32.338423 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.338416 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lrpqk\""
Apr 21 03:57:32.338600 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.338418 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 03:57:32.360602 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.360577 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7mb9p"]
Apr 21 03:57:32.360602 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.360603 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbrz9"]
Apr 21 03:57:32.360764 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.360716 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:32.362764 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.362735 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4vtdx\""
Apr 21 03:57:32.363112 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.363094 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 03:57:32.363200 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.363109 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 03:57:32.363200 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.363093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 03:57:32.430408 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.430372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86dacc78-49b6-4d77-b33d-e5f6f827d63e-config-volume\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.430408 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.430411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dacc78-49b6-4d77-b33d-e5f6f827d63e-tmp-dir\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.430649 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.430432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.430649 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.430452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzc6z\" (UniqueName: \"kubernetes.io/projected/c3286e93-02a3-4094-a61d-5b8ba11a35d6-kube-api-access-hzc6z\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:32.430649 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.430585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:32.430783 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.430658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ffvk\" (UniqueName: \"kubernetes.io/projected/86dacc78-49b6-4d77-b33d-e5f6f827d63e-kube-api-access-7ffvk\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.531874 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.531790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:32.531874 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.531847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ffvk\" (UniqueName: \"kubernetes.io/projected/86dacc78-49b6-4d77-b33d-e5f6f827d63e-kube-api-access-7ffvk\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.532098 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.531889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86dacc78-49b6-4d77-b33d-e5f6f827d63e-config-volume\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.532098 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.531911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dacc78-49b6-4d77-b33d-e5f6f827d63e-tmp-dir\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.532098 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.531935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.532098 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:32.531949 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:32.532098 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:32.532023 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert podName:c3286e93-02a3-4094-a61d-5b8ba11a35d6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:33.032002708 +0000 UTC m=+33.598973198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert") pod "ingress-canary-zbrz9" (UID: "c3286e93-02a3-4094-a61d-5b8ba11a35d6") : secret "canary-serving-cert" not found
Apr 21 03:57:32.532098 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.531959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzc6z\" (UniqueName: \"kubernetes.io/projected/c3286e93-02a3-4094-a61d-5b8ba11a35d6-kube-api-access-hzc6z\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:32.532098 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:32.532063 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:32.532371 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:32.532116 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls podName:86dacc78-49b6-4d77-b33d-e5f6f827d63e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:33.0320994 +0000 UTC m=+33.599069883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls") pod "dns-default-7mb9p" (UID: "86dacc78-49b6-4d77-b33d-e5f6f827d63e") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:32.532371 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.532243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86dacc78-49b6-4d77-b33d-e5f6f827d63e-tmp-dir\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.532481 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.532460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86dacc78-49b6-4d77-b33d-e5f6f827d63e-config-volume\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.541506 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.541484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ffvk\" (UniqueName: \"kubernetes.io/projected/86dacc78-49b6-4d77-b33d-e5f6f827d63e-kube-api-access-7ffvk\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:32.541779 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:32.541758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzc6z\" (UniqueName: \"kubernetes.io/projected/c3286e93-02a3-4094-a61d-5b8ba11a35d6-kube-api-access-hzc6z\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:33.035904 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.035872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:33.036294 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.035984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:33.036294 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.036005 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:33.036294 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.036098 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert podName:c3286e93-02a3-4094-a61d-5b8ba11a35d6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:34.036078587 +0000 UTC m=+34.603049076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert") pod "ingress-canary-zbrz9" (UID: "c3286e93-02a3-4094-a61d-5b8ba11a35d6") : secret "canary-serving-cert" not found
Apr 21 03:57:33.036294 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.036121 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:33.036294 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.036172 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls podName:86dacc78-49b6-4d77-b33d-e5f6f827d63e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:34.036156676 +0000 UTC m=+34.603127161 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls") pod "dns-default-7mb9p" (UID: "86dacc78-49b6-4d77-b33d-e5f6f827d63e") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:33.040442 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.040409 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerStarted","Data":"84342dadfeaad4406bcac200affc60d9190552557eed0ab6d57f4451784bf7ce"}
Apr 21 03:57:33.640515 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.640471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn"
Apr 21 03:57:33.640700 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.640619 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:33.640700 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.640680 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:05.640665524 +0000 UTC m=+66.207635994 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:33.741659 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.741624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b"
Apr 21 03:57:33.741790 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.741773 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 03:57:33.741829 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.741797 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 03:57:33.741829 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.741807 2578 projected.go:194] Error preparing data for projected volume kube-api-access-hxxz5 for pod openshift-network-diagnostics/network-check-target-sql8b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:33.741892 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:33.741859 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5 podName:e498c7e8-3ee2-49ce-8ccf-9a86e869f003 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:05.741841357 +0000 UTC m=+66.308811840 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hxxz5" (UniqueName: "kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5") pod "network-check-target-sql8b" (UID: "e498c7e8-3ee2-49ce-8ccf-9a86e869f003") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:33.872422 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.872385 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:33.872586 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.872463 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn"
Apr 21 03:57:33.872586 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.872575 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b"
Apr 21 03:57:33.874934 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.874906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 03:57:33.875072 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.874951 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 03:57:33.875072 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.874968 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 03:57:33.875480 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.875464 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 03:57:33.875579 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.875563 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tn8dn\""
Apr 21 03:57:33.875693 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:33.875674 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rcf6j\""
Apr 21 03:57:34.044285 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:34.044256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:34.044742 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:34.044332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:34.044742 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:34.044419 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:34.044742 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:34.044464 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:34.044742 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:34.044535 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls podName:86dacc78-49b6-4d77-b33d-e5f6f827d63e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:36.044512351 +0000 UTC m=+36.611482836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls") pod "dns-default-7mb9p" (UID: "86dacc78-49b6-4d77-b33d-e5f6f827d63e") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:34.044742 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:34.044531 2578 generic.go:358] "Generic (PLEG): container finished" podID="db466d44-e778-45ee-935c-04ea6c071763" containerID="84342dadfeaad4406bcac200affc60d9190552557eed0ab6d57f4451784bf7ce" exitCode=0
Apr 21 03:57:34.044742 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:34.044571 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert podName:c3286e93-02a3-4094-a61d-5b8ba11a35d6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:36.044548452 +0000 UTC m=+36.611518939 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert") pod "ingress-canary-zbrz9" (UID: "c3286e93-02a3-4094-a61d-5b8ba11a35d6") : secret "canary-serving-cert" not found
Apr 21 03:57:34.044742 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:34.044570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerDied","Data":"84342dadfeaad4406bcac200affc60d9190552557eed0ab6d57f4451784bf7ce"}
Apr 21 03:57:35.049755 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:35.049723 2578 generic.go:358] "Generic (PLEG): container finished" podID="db466d44-e778-45ee-935c-04ea6c071763" containerID="2b33c0fbbf42fd07dc2f462a26c739e6fbbc064d6125f89cb3e074baecd8a36b" exitCode=0
Apr 21 03:57:35.050111 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:35.049782 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerDied","Data":"2b33c0fbbf42fd07dc2f462a26c739e6fbbc064d6125f89cb3e074baecd8a36b"}
Apr 21 03:57:36.054999 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:36.054819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" event={"ID":"db466d44-e778-45ee-935c-04ea6c071763","Type":"ContainerStarted","Data":"76e6fc117ea07ad4574057c098c938e251b9b2d4bd73535595a23de7b6f55ef0"}
Apr 21 03:57:36.060750 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:36.060730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:36.060855 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:36.060775 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:36.060908 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:36.060862 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:36.060908 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:36.060876 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:36.060908 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:36.060905 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert podName:c3286e93-02a3-4094-a61d-5b8ba11a35d6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:40.060893239 +0000 UTC m=+40.627863709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert") pod "ingress-canary-zbrz9" (UID: "c3286e93-02a3-4094-a61d-5b8ba11a35d6") : secret "canary-serving-cert" not found
Apr 21 03:57:36.061039 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:36.060932 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls podName:86dacc78-49b6-4d77-b33d-e5f6f827d63e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:40.060913771 +0000 UTC m=+40.627884256 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls") pod "dns-default-7mb9p" (UID: "86dacc78-49b6-4d77-b33d-e5f6f827d63e") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:36.076529 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:36.076490 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8v8nb" podStartSLOduration=5.776050464 podStartE2EDuration="36.076477819s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:57:02.471301265 +0000 UTC m=+3.038271749" lastFinishedPulling="2026-04-21 03:57:32.771728631 +0000 UTC m=+33.338699104" observedRunningTime="2026-04-21 03:57:36.075524447 +0000 UTC m=+36.642494939" watchObservedRunningTime="2026-04-21 03:57:36.076477819 +0000 UTC m=+36.643448311"
Apr 21 03:57:39.786579 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:39.786526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:39.789996 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:39.789964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ba63635-96d8-482b-a6ee-d309369ecee1-original-pull-secret\") pod \"global-pull-secret-syncer-lqtd7\" (UID: \"8ba63635-96d8-482b-a6ee-d309369ecee1\") " pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:39.881717 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:39.881690 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lqtd7"
Apr 21 03:57:40.005876 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:40.005844 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lqtd7"]
Apr 21 03:57:40.009196 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:40.009168 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ba63635_96d8_482b_a6ee_d309369ecee1.slice/crio-b27c0946205e1eebd23559bcafd1e8eb18c358f29fdba12e3ad0de25c27141ed WatchSource:0}: Error finding container b27c0946205e1eebd23559bcafd1e8eb18c358f29fdba12e3ad0de25c27141ed: Status 404 returned error can't find the container with id b27c0946205e1eebd23559bcafd1e8eb18c358f29fdba12e3ad0de25c27141ed
Apr 21 03:57:40.063621 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:40.063537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lqtd7" event={"ID":"8ba63635-96d8-482b-a6ee-d309369ecee1","Type":"ContainerStarted","Data":"b27c0946205e1eebd23559bcafd1e8eb18c358f29fdba12e3ad0de25c27141ed"}
Apr 21 03:57:40.088938 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:40.088910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:57:40.089027 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:40.088975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:57:40.089086 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:40.089065 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:40.089136 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:40.089083 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:40.089136 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:40.089114 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls podName:86dacc78-49b6-4d77-b33d-e5f6f827d63e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:48.089101335 +0000 UTC m=+48.656071804 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls") pod "dns-default-7mb9p" (UID: "86dacc78-49b6-4d77-b33d-e5f6f827d63e") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:40.089221 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:40.089158 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert podName:c3286e93-02a3-4094-a61d-5b8ba11a35d6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:48.089138336 +0000 UTC m=+48.656108809 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert") pod "ingress-canary-zbrz9" (UID: "c3286e93-02a3-4094-a61d-5b8ba11a35d6") : secret "canary-serving-cert" not found Apr 21 03:57:44.076263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:44.076224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lqtd7" event={"ID":"8ba63635-96d8-482b-a6ee-d309369ecee1","Type":"ContainerStarted","Data":"afacf3d60bb9bf1221ef2ad35c094cdf955eced4c877a3cba41aeb5c6bb7d939"} Apr 21 03:57:48.153147 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:48.153109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p" Apr 21 03:57:48.153621 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:48.153166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9" Apr 21 03:57:48.153621 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:48.153269 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:48.153621 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:48.153354 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls podName:86dacc78-49b6-4d77-b33d-e5f6f827d63e nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.153338103 +0000 UTC m=+64.720308577 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls") pod "dns-default-7mb9p" (UID: "86dacc78-49b6-4d77-b33d-e5f6f827d63e") : secret "dns-default-metrics-tls" not found Apr 21 03:57:48.153621 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:48.153269 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:48.153621 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:48.153403 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert podName:c3286e93-02a3-4094-a61d-5b8ba11a35d6 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.153390887 +0000 UTC m=+64.720361357 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert") pod "ingress-canary-zbrz9" (UID: "c3286e93-02a3-4094-a61d-5b8ba11a35d6") : secret "canary-serving-cert" not found Apr 21 03:57:56.341255 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.341205 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lqtd7" podStartSLOduration=29.91970758 podStartE2EDuration="33.341188954s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:40.010955199 +0000 UTC m=+40.577925668" lastFinishedPulling="2026-04-21 03:57:43.432436564 +0000 UTC m=+43.999407042" observedRunningTime="2026-04-21 03:57:44.090896418 +0000 UTC m=+44.657866909" watchObservedRunningTime="2026-04-21 03:57:56.341188954 +0000 UTC m=+56.908159445" Apr 21 03:57:56.341952 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.341931 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f"] Apr 21 03:57:56.346465 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:57:56.346441 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5vjmh"] Apr 21 03:57:56.346675 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.346651 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f" Apr 21 03:57:56.350429 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.350204 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:56.350954 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.350906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-gqsng\"" Apr 21 03:57:56.351355 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.351305 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"] Apr 21 03:57:56.351452 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.351425 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:57:56.352224 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.352205 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:56.354496 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.354479 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 03:57:56.354610 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.354593 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:57:56.355009 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.354988 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:56.355101 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.355059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-68k6s\"" Apr 21 03:57:56.355162 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.355141 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:56.355241 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.355220 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 03:57:56.356849 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.356829 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 03:57:56.357168 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.357155 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 03:57:56.357695 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.357677 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9dm5k\"" Apr 21 03:57:56.357793 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.357779 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 03:57:56.358759 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.358730 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 03:57:56.359515 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.359494 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f"] Apr 21 03:57:56.362666 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.362642 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"] Apr 21 03:57:56.364491 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.364472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 03:57:56.367128 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.367109 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5vjmh"] Apr 21 03:57:56.402963 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.402929 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"] Apr 21 03:57:56.406200 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.406183 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:57:56.408065 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zzxs\" (UniqueName: \"kubernetes.io/projected/f065d82f-468a-4343-b62c-5c000b2c9ad2-kube-api-access-9zzxs\") pod \"volume-data-source-validator-7c6cbb6c87-zl47f\" (UID: \"f065d82f-468a-4343-b62c-5c000b2c9ad2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f" Apr 21 03:57:56.408178 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-config\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:57:56.408178 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/53593715-fad9-4f5d-8bfa-5579ca4bfd14-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:57:56.408178 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:57:56.408379 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:57:56.408379 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-serving-cert\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:57:56.408379 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldq6w\" (UniqueName: \"kubernetes.io/projected/53593715-fad9-4f5d-8bfa-5579ca4bfd14-kube-api-access-ldq6w\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:57:56.408379 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2846597e-4516-4d1d-9e48-8d7c984b548c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:57:56.408379 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:57:56.408327 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5tz\" (UniqueName: \"kubernetes.io/projected/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-kube-api-access-cj5tz\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:57:56.408379 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408354 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-trusted-ca\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:57:56.408696 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408680 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-nws68\"" Apr 21 03:57:56.408737 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.408703 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 03:57:56.409250 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.409237 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 03:57:56.410771 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.410737 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5w6b"] Apr 21 03:57:56.413898 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.413881 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"] Apr 21 
03:57:56.414047 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.414032 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-s5w6b" Apr 21 03:57:56.417226 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.417205 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 03:57:56.417351 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.417232 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 03:57:56.417351 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.417248 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8655966fcf-5f4kf"] Apr 21 03:57:56.417505 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.417480 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r" Apr 21 03:57:56.417505 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.417496 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 03:57:56.417925 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.417869 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bbz6s\"" Apr 21 03:57:56.418395 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.418375 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 03:57:56.419759 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.419713 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 
03:57:56.419945 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.419924 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:56.420071 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.420054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 03:57:56.420144 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.420127 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-9hbqn\"" Apr 21 03:57:56.420531 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.420514 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 03:57:56.420893 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.420839 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"] Apr 21 03:57:56.421008 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.420977 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:57:56.426174 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.426152 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 03:57:56.426927 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.426907 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 03:57:56.427045 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.426928 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 03:57:56.427045 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.427034 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ntv6f\"" Apr 21 03:57:56.427390 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.427373 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 03:57:56.429265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.429248 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"] Apr 21 03:57:56.432001 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.431982 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 03:57:56.439957 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.439934 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8655966fcf-5f4kf"] Apr 21 03:57:56.440679 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.440656 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-s5w6b"] Apr 21 03:57:56.508636 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b31507b9-91ed-4a27-ad54-be88b2865602-tmp\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b" Apr 21 03:57:56.508636 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdaaadc4-2dd1-4be8-955c-755deb5df200-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-config\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/53593715-fad9-4f5d-8bfa-5579ca4bfd14-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508737 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-image-registry-private-configuration\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-certificates\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mlv6\" (UniqueName: \"kubernetes.io/projected/b31507b9-91ed-4a27-ad54-be88b2865602-kube-api-access-8mlv6\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/b31507b9-91ed-4a27-ad54-be88b2865602-snapshots\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b" Apr 21 03:57:56.508883 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508864 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-trusted-ca\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2846597e-4516-4d1d-9e48-8d7c984b548c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:56.508909 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.508915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q72zx\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-kube-api-access-q72zx\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:56.508986 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert podName:2846597e-4516-4d1d-9e48-8d7c984b548c nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.008960738 +0000 UTC m=+57.575931227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dg8bb" (UID: "2846597e-4516-4d1d-9e48-8d7c984b548c") : secret "networking-console-plugin-cert" not found
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509027 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b31507b9-91ed-4a27-ad54-be88b2865602-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-installation-pull-secrets\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmn7s\" (UniqueName: \"kubernetes.io/projected/bdaaadc4-2dd1-4be8-955c-755deb5df200-kube-api-access-nmn7s\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-trusted-ca\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-bound-sa-token\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509242 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaaadc4-2dd1-4be8-955c-755deb5df200-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.509265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldq6w\" (UniqueName: \"kubernetes.io/projected/53593715-fad9-4f5d-8bfa-5579ca4bfd14-kube-api-access-ldq6w\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:56.509285 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zzxs\" (UniqueName: \"kubernetes.io/projected/f065d82f-468a-4343-b62c-5c000b2c9ad2-kube-api-access-9zzxs\") pod \"volume-data-source-validator-7c6cbb6c87-zl47f\" (UID: \"f065d82f-468a-4343-b62c-5c000b2c9ad2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:56.509366 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls podName:53593715-fad9-4f5d-8bfa-5579ca4bfd14 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.009349883 +0000 UTC m=+57.576320357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rf7f7" (UID: "53593715-fad9-4f5d-8bfa-5579ca4bfd14") : secret "cluster-monitoring-operator-tls" not found
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-serving-cert\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-ca-trust-extracted\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509457 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-config\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509483 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b31507b9-91ed-4a27-ad54-be88b2865602-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b31507b9-91ed-4a27-ad54-be88b2865602-serving-cert\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/53593715-fad9-4f5d-8bfa-5579ca4bfd14-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5tz\" (UniqueName: \"kubernetes.io/projected/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-kube-api-access-cj5tz\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.509805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.509641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2846597e-4516-4d1d-9e48-8d7c984b548c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"
Apr 21 03:57:56.510172 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.510064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-trusted-ca\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.511754 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.511737 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-serving-cert\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.524659 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.524631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zzxs\" (UniqueName: \"kubernetes.io/projected/f065d82f-468a-4343-b62c-5c000b2c9ad2-kube-api-access-9zzxs\") pod \"volume-data-source-validator-7c6cbb6c87-zl47f\" (UID: \"f065d82f-468a-4343-b62c-5c000b2c9ad2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f"
Apr 21 03:57:56.525746 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.525722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldq6w\" (UniqueName: \"kubernetes.io/projected/53593715-fad9-4f5d-8bfa-5579ca4bfd14-kube-api-access-ldq6w\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:57:56.525941 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.525926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5tz\" (UniqueName: \"kubernetes.io/projected/d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69-kube-api-access-cj5tz\") pod \"console-operator-9d4b6777b-5vjmh\" (UID: \"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69\") " pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.610560 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-image-registry-private-configuration\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.610560 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610508 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-certificates\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.610739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mlv6\" (UniqueName: \"kubernetes.io/projected/b31507b9-91ed-4a27-ad54-be88b2865602-kube-api-access-8mlv6\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.610739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b31507b9-91ed-4a27-ad54-be88b2865602-snapshots\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.610739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-trusted-ca\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.610739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q72zx\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-kube-api-access-q72zx\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.610739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b31507b9-91ed-4a27-ad54-be88b2865602-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.610739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.610733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-installation-pull-secrets\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.611094 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmn7s\" (UniqueName: \"kubernetes.io/projected/bdaaadc4-2dd1-4be8-955c-755deb5df200-kube-api-access-nmn7s\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.611165 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-bound-sa-token\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.611220 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaaadc4-2dd1-4be8-955c-755deb5df200-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.611410 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-ca-trust-extracted\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.611524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.611524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b31507b9-91ed-4a27-ad54-be88b2865602-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.611524 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611483 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b31507b9-91ed-4a27-ad54-be88b2865602-serving-cert\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.611670 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b31507b9-91ed-4a27-ad54-be88b2865602-tmp\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.611670 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdaaadc4-2dd1-4be8-955c-755deb5df200-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.611670 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:56.611582 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:56.611670 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:56.611600 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8655966fcf-5f4kf: secret "image-registry-tls" not found
Apr 21 03:57:56.611670 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:56.611665 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls podName:2ed93b1d-890e-40f1-86fc-3b6445fb57ec nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.111646506 +0000 UTC m=+57.678616995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls") pod "image-registry-8655966fcf-5f4kf" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec") : secret "image-registry-tls" not found
Apr 21 03:57:56.611670 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b31507b9-91ed-4a27-ad54-be88b2865602-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.611968 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdaaadc4-2dd1-4be8-955c-755deb5df200-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.611968 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.611945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-certificates\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.612153 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.612130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b31507b9-91ed-4a27-ad54-be88b2865602-snapshots\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.612271 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.612145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-ca-trust-extracted\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.612382 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.612348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b31507b9-91ed-4a27-ad54-be88b2865602-tmp\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.612382 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.612162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-trusted-ca\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.612575 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.612415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b31507b9-91ed-4a27-ad54-be88b2865602-service-ca-bundle\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.613531 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.613508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-image-registry-private-configuration\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.613621 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.613561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-installation-pull-secrets\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.613806 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.613788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdaaadc4-2dd1-4be8-955c-755deb5df200-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.614305 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.614288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b31507b9-91ed-4a27-ad54-be88b2865602-serving-cert\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.618755 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.618720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q72zx\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-kube-api-access-q72zx\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.620365 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.620340 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmn7s\" (UniqueName: \"kubernetes.io/projected/bdaaadc4-2dd1-4be8-955c-755deb5df200-kube-api-access-nmn7s\") pod \"kube-storage-version-migrator-operator-6769c5d45-2nv4r\" (UID: \"bdaaadc4-2dd1-4be8-955c-755deb5df200\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.620687 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.620660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-bound-sa-token\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:56.621046 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.621026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mlv6\" (UniqueName: \"kubernetes.io/projected/b31507b9-91ed-4a27-ad54-be88b2865602-kube-api-access-8mlv6\") pod \"insights-operator-585dfdc468-s5w6b\" (UID: \"b31507b9-91ed-4a27-ad54-be88b2865602\") " pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.659052 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.659005 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f"
Apr 21 03:57:56.667775 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.667757 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:57:56.725714 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.725458 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-s5w6b"
Apr 21 03:57:56.736385 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.735958 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"
Apr 21 03:57:56.785027 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.784995 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f"]
Apr 21 03:57:56.789964 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:56.789847 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf065d82f_468a_4343_b62c_5c000b2c9ad2.slice/crio-eb8de0b836dbd589546d9a772b80a6754c138d4f005ca5d156c8fe7e39c44919 WatchSource:0}: Error finding container eb8de0b836dbd589546d9a772b80a6754c138d4f005ca5d156c8fe7e39c44919: Status 404 returned error can't find the container with id eb8de0b836dbd589546d9a772b80a6754c138d4f005ca5d156c8fe7e39c44919
Apr 21 03:57:56.807451 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.807254 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5vjmh"]
Apr 21 03:57:56.811971 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:56.811940 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd67a7c8e_4570_49e2_a6b2_fb6ceeba5a69.slice/crio-34aab062787426c5e5488aad18557a77d0e84fa604c2e178ff0db4ed0388f043 WatchSource:0}: Error finding container 34aab062787426c5e5488aad18557a77d0e84fa604c2e178ff0db4ed0388f043: Status 404 returned error can't find the container with id 34aab062787426c5e5488aad18557a77d0e84fa604c2e178ff0db4ed0388f043
Apr 21 03:57:56.868880 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.868807 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-s5w6b"]
Apr 21 03:57:56.871578 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:56.871551 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31507b9_91ed_4a27_ad54_be88b2865602.slice/crio-7ac56d44ee9357fd26462ea8fa56bbdeb5d65515605077b2f07ed7503d5d8741 WatchSource:0}: Error finding container 7ac56d44ee9357fd26462ea8fa56bbdeb5d65515605077b2f07ed7503d5d8741: Status 404 returned error can't find the container with id 7ac56d44ee9357fd26462ea8fa56bbdeb5d65515605077b2f07ed7503d5d8741
Apr 21 03:57:56.883603 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:56.883576 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r"]
Apr 21 03:57:56.886636 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:57:56.886608 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdaaadc4_2dd1_4be8_955c_755deb5df200.slice/crio-c9eadc05305d66aa53c752fc73f36df1398e1a64b19ca06e2863bf82ca3352cf WatchSource:0}: Error finding container c9eadc05305d66aa53c752fc73f36df1398e1a64b19ca06e2863bf82ca3352cf: Status 404 returned error can't find the container with id c9eadc05305d66aa53c752fc73f36df1398e1a64b19ca06e2863bf82ca3352cf
Apr 21 03:57:57.015142 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:57.015108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"
Apr 21 03:57:57.015306 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:57.015170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:57:57.015306 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:57.015265 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 03:57:57.015409 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:57.015350 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert podName:2846597e-4516-4d1d-9e48-8d7c984b548c nodeName:}" failed. No retries permitted until 2026-04-21 03:57:58.015333313 +0000 UTC m=+58.582303788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dg8bb" (UID: "2846597e-4516-4d1d-9e48-8d7c984b548c") : secret "networking-console-plugin-cert" not found
Apr 21 03:57:57.015409 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:57.015271 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 03:57:57.015475 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:57.015416 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls podName:53593715-fad9-4f5d-8bfa-5579ca4bfd14 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:58.015403474 +0000 UTC m=+58.582373948 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rf7f7" (UID: "53593715-fad9-4f5d-8bfa-5579ca4bfd14") : secret "cluster-monitoring-operator-tls" not found
Apr 21 03:57:57.100364 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:57.100328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5w6b" event={"ID":"b31507b9-91ed-4a27-ad54-be88b2865602","Type":"ContainerStarted","Data":"7ac56d44ee9357fd26462ea8fa56bbdeb5d65515605077b2f07ed7503d5d8741"}
Apr 21 03:57:57.101278 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:57.101254 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" event={"ID":"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69","Type":"ContainerStarted","Data":"34aab062787426c5e5488aad18557a77d0e84fa604c2e178ff0db4ed0388f043"}
Apr 21 03:57:57.102160 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:57.102141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f" event={"ID":"f065d82f-468a-4343-b62c-5c000b2c9ad2","Type":"ContainerStarted","Data":"eb8de0b836dbd589546d9a772b80a6754c138d4f005ca5d156c8fe7e39c44919"}
Apr 21 03:57:57.103038 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:57.103019 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r" event={"ID":"bdaaadc4-2dd1-4be8-955c-755deb5df200","Type":"ContainerStarted","Data":"c9eadc05305d66aa53c752fc73f36df1398e1a64b19ca06e2863bf82ca3352cf"}
Apr 21 03:57:57.116614 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:57.116591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:57:57.116729 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:57.116718 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 03:57:57.116770 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:57.116731 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8655966fcf-5f4kf: secret "image-registry-tls" not found
Apr 21 03:57:57.116802 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:57.116780 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls podName:2ed93b1d-890e-40f1-86fc-3b6445fb57ec nodeName:}" failed. No retries permitted until 2026-04-21 03:57:58.116763343 +0000 UTC m=+58.683733826 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls") pod "image-registry-8655966fcf-5f4kf" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec") : secret "image-registry-tls" not found
Apr 21 03:57:58.026432 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:58.026379 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"
Apr 21 03:57:58.026922 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:58.026492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:57:58.026922 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:58.026704 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 03:57:58.026922 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:58.026786 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls podName:53593715-fad9-4f5d-8bfa-5579ca4bfd14 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.026765564 +0000 UTC m=+60.593736039 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rf7f7" (UID: "53593715-fad9-4f5d-8bfa-5579ca4bfd14") : secret "cluster-monitoring-operator-tls" not found
Apr 21 03:57:58.027102 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:58.027080 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 03:57:58.027171 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:58.027164 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert podName:2846597e-4516-4d1d-9e48-8d7c984b548c nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.027143247 +0000 UTC m=+60.594113745 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dg8bb" (UID: "2846597e-4516-4d1d-9e48-8d7c984b548c") : secret "networking-console-plugin-cert" not found Apr 21 03:57:58.043275 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:58.042891 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ssvt" Apr 21 03:57:58.127385 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:58.127346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:57:58.127584 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:58.127566 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:57:58.127584 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:58.127582 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8655966fcf-5f4kf: secret "image-registry-tls" not found Apr 21 03:57:58.127686 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:57:58.127650 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls podName:2ed93b1d-890e-40f1-86fc-3b6445fb57ec nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.127631394 +0000 UTC m=+60.694601867 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls") pod "image-registry-8655966fcf-5f4kf" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec") : secret "image-registry-tls" not found Apr 21 03:57:59.108543 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:59.108509 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f" event={"ID":"f065d82f-468a-4343-b62c-5c000b2c9ad2","Type":"ContainerStarted","Data":"372c177a6c07c4d009edbb05949950bcf4ec6372ae39dbd4d3e4d9c8602213fd"} Apr 21 03:57:59.122846 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:57:59.122786 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zl47f" podStartSLOduration=1.627409855 podStartE2EDuration="3.122771613s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:57:56.792729289 +0000 UTC m=+57.359699760" lastFinishedPulling="2026-04-21 03:57:58.288091031 +0000 UTC m=+58.855061518" observedRunningTime="2026-04-21 03:57:59.121831221 +0000 UTC m=+59.688801712" watchObservedRunningTime="2026-04-21 03:57:59.122771613 +0000 UTC m=+59.689742083" Apr 21 03:58:00.044997 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.044961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:58:00.045169 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.045036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:58:00.045169 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:00.045131 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 03:58:00.045265 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:00.045189 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls podName:53593715-fad9-4f5d-8bfa-5579ca4bfd14 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.045175387 +0000 UTC m=+64.612145873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rf7f7" (UID: "53593715-fad9-4f5d-8bfa-5579ca4bfd14") : secret "cluster-monitoring-operator-tls" not found Apr 21 03:58:00.045265 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:00.045131 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:58:00.045381 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:00.045288 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert podName:2846597e-4516-4d1d-9e48-8d7c984b548c nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.045264615 +0000 UTC m=+64.612235105 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dg8bb" (UID: "2846597e-4516-4d1d-9e48-8d7c984b548c") : secret "networking-console-plugin-cert" not found Apr 21 03:58:00.113026 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.112939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r" event={"ID":"bdaaadc4-2dd1-4be8-955c-755deb5df200","Type":"ContainerStarted","Data":"74e72e906f5ed1f52c808ffc8aa7c996934191c7234ace875e8b97fe9336308f"} Apr 21 03:58:00.114489 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.114460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5w6b" event={"ID":"b31507b9-91ed-4a27-ad54-be88b2865602","Type":"ContainerStarted","Data":"225dbaff54241d6c625ab4f75702092d055fdde90cdb0a83635c6054542d8709"} Apr 21 03:58:00.116029 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.116009 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/0.log" Apr 21 03:58:00.116130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.116043 2578 generic.go:358] "Generic (PLEG): container finished" podID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69" containerID="ba287aed344146f8d3426c920d5a67eec13c3ff1844ad73fd773ebce77149a17" exitCode=255 Apr 21 03:58:00.116130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.116116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" event={"ID":"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69","Type":"ContainerDied","Data":"ba287aed344146f8d3426c920d5a67eec13c3ff1844ad73fd773ebce77149a17"} Apr 21 03:58:00.116355 
ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.116338 2578 scope.go:117] "RemoveContainer" containerID="ba287aed344146f8d3426c920d5a67eec13c3ff1844ad73fd773ebce77149a17" Apr 21 03:58:00.129857 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.129807 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r" podStartSLOduration=1.143177447 podStartE2EDuration="4.129789451s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:57:56.888800293 +0000 UTC m=+57.455770767" lastFinishedPulling="2026-04-21 03:57:59.87541229 +0000 UTC m=+60.442382771" observedRunningTime="2026-04-21 03:58:00.128463208 +0000 UTC m=+60.695433702" watchObservedRunningTime="2026-04-21 03:58:00.129789451 +0000 UTC m=+60.696759944" Apr 21 03:58:00.146611 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.145804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:58:00.147274 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:00.146985 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:58:00.147274 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:00.147008 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8655966fcf-5f4kf: secret "image-registry-tls" not found Apr 21 03:58:00.147274 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:00.147080 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls 
podName:2ed93b1d-890e-40f1-86fc-3b6445fb57ec nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.14705409 +0000 UTC m=+64.714024574 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls") pod "image-registry-8655966fcf-5f4kf" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec") : secret "image-registry-tls" not found Apr 21 03:58:00.153640 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:00.153590 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-s5w6b" podStartSLOduration=1.157396955 podStartE2EDuration="4.153573305s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:57:56.873282945 +0000 UTC m=+57.440253416" lastFinishedPulling="2026-04-21 03:57:59.869459296 +0000 UTC m=+60.436429766" observedRunningTime="2026-04-21 03:58:00.147758428 +0000 UTC m=+60.714728919" watchObservedRunningTime="2026-04-21 03:58:00.153573305 +0000 UTC m=+60.720543798" Apr 21 03:58:01.119702 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:01.119673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/1.log" Apr 21 03:58:01.120145 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:01.120075 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/0.log" Apr 21 03:58:01.120145 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:01.120107 2578 generic.go:358] "Generic (PLEG): container finished" podID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69" containerID="01a1f73b969529cdba0dcbdf35f3bb22fd2db6a80ef23cfa624e07f9670a0225" exitCode=255 Apr 21 03:58:01.120252 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:01.120182 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" event={"ID":"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69","Type":"ContainerDied","Data":"01a1f73b969529cdba0dcbdf35f3bb22fd2db6a80ef23cfa624e07f9670a0225"} Apr 21 03:58:01.120252 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:01.120226 2578 scope.go:117] "RemoveContainer" containerID="ba287aed344146f8d3426c920d5a67eec13c3ff1844ad73fd773ebce77149a17" Apr 21 03:58:01.120478 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:01.120461 2578 scope.go:117] "RemoveContainer" containerID="01a1f73b969529cdba0dcbdf35f3bb22fd2db6a80ef23cfa624e07f9670a0225" Apr 21 03:58:01.120688 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:01.120670 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5vjmh_openshift-console-operator(d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" podUID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69" Apr 21 03:58:02.123925 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:02.123897 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/1.log" Apr 21 03:58:02.124301 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:02.124214 2578 scope.go:117] "RemoveContainer" containerID="01a1f73b969529cdba0dcbdf35f3bb22fd2db6a80ef23cfa624e07f9670a0225" Apr 21 03:58:02.124421 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:02.124402 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-5vjmh_openshift-console-operator(d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" podUID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69" Apr 21 03:58:02.734301 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:02.734272 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m7pww_48171444-16ad-44d3-adcd-dbc651bb6b7e/dns-node-resolver/0.log" Apr 21 03:58:03.534828 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:03.534799 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9j76c_4b9475f5-b0ae-474d-b51c-d5a1efe76899/node-ca/0.log" Apr 21 03:58:04.076609 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:04.076571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:58:04.076818 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:04.076654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:58:04.076818 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.076732 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:58:04.076818 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.076735 2578 secret.go:189] Couldn't 
get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 03:58:04.076818 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.076791 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert podName:2846597e-4516-4d1d-9e48-8d7c984b548c nodeName:}" failed. No retries permitted until 2026-04-21 03:58:12.076777972 +0000 UTC m=+72.643748446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dg8bb" (UID: "2846597e-4516-4d1d-9e48-8d7c984b548c") : secret "networking-console-plugin-cert" not found Apr 21 03:58:04.076818 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.076815 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls podName:53593715-fad9-4f5d-8bfa-5579ca4bfd14 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:12.076796923 +0000 UTC m=+72.643767409 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rf7f7" (UID: "53593715-fad9-4f5d-8bfa-5579ca4bfd14") : secret "cluster-monitoring-operator-tls" not found Apr 21 03:58:04.177143 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:04.177098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p" Apr 21 03:58:04.177362 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:04.177227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:58:04.177362 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.177242 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:04.177362 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:04.177257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9" Apr 21 03:58:04.177362 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.177299 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls podName:86dacc78-49b6-4d77-b33d-e5f6f827d63e nodeName:}" 
failed. No retries permitted until 2026-04-21 03:58:36.177283986 +0000 UTC m=+96.744254469 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls") pod "dns-default-7mb9p" (UID: "86dacc78-49b6-4d77-b33d-e5f6f827d63e") : secret "dns-default-metrics-tls" not found Apr 21 03:58:04.177582 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.177372 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:04.177582 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.177436 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert podName:c3286e93-02a3-4094-a61d-5b8ba11a35d6 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:36.177420558 +0000 UTC m=+96.744391046 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert") pod "ingress-canary-zbrz9" (UID: "c3286e93-02a3-4094-a61d-5b8ba11a35d6") : secret "canary-serving-cert" not found Apr 21 03:58:04.177582 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.177374 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:58:04.177582 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.177458 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8655966fcf-5f4kf: secret "image-registry-tls" not found Apr 21 03:58:04.177582 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:04.177493 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls podName:2ed93b1d-890e-40f1-86fc-3b6445fb57ec nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:12.177481395 +0000 UTC m=+72.744451870 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls") pod "image-registry-8655966fcf-5f4kf" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec") : secret "image-registry-tls" not found Apr 21 03:58:05.689954 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.689912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:58:05.691902 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.691878 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 03:58:05.700659 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:05.700634 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 03:58:05.700718 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:05.700711 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs podName:88564462-797f-416f-b56b-0e31e0156815 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:09.700690849 +0000 UTC m=+130.267661337 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs") pod "network-metrics-daemon-wcnkn" (UID: "88564462-797f-416f-b56b-0e31e0156815") : secret "metrics-daemon-secret" not found Apr 21 03:58:05.790931 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.790893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:58:05.793437 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.793419 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 03:58:05.803530 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.803514 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 03:58:05.814325 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.814284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxz5\" (UniqueName: \"kubernetes.io/projected/e498c7e8-3ee2-49ce-8ccf-9a86e869f003-kube-api-access-hxxz5\") pod \"network-check-target-sql8b\" (UID: \"e498c7e8-3ee2-49ce-8ccf-9a86e869f003\") " pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:58:05.989993 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.989905 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tn8dn\"" Apr 21 03:58:05.998649 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:05.998611 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:58:06.110644 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:06.110608 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sql8b"] Apr 21 03:58:06.114236 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:06.114205 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode498c7e8_3ee2_49ce_8ccf_9a86e869f003.slice/crio-c287a73e88d5486653af6f1490b92980da9d455dd8fef45887ad8a5572d0e612 WatchSource:0}: Error finding container c287a73e88d5486653af6f1490b92980da9d455dd8fef45887ad8a5572d0e612: Status 404 returned error can't find the container with id c287a73e88d5486653af6f1490b92980da9d455dd8fef45887ad8a5572d0e612 Apr 21 03:58:06.136520 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:06.136483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sql8b" event={"ID":"e498c7e8-3ee2-49ce-8ccf-9a86e869f003","Type":"ContainerStarted","Data":"c287a73e88d5486653af6f1490b92980da9d455dd8fef45887ad8a5572d0e612"} Apr 21 03:58:06.668727 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:06.668683 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:58:06.668727 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:06.668734 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:58:06.669192 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:06.669174 2578 scope.go:117] "RemoveContainer" containerID="01a1f73b969529cdba0dcbdf35f3bb22fd2db6a80ef23cfa624e07f9670a0225" Apr 21 03:58:06.669414 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:06.669385 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5vjmh_openshift-console-operator(d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" podUID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69" Apr 21 03:58:09.151397 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:09.151318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sql8b" event={"ID":"e498c7e8-3ee2-49ce-8ccf-9a86e869f003","Type":"ContainerStarted","Data":"4a62b9cfbc5d987e8e5b68b978e209eccafff30d36ef9f719b495c0880b2bfca"} Apr 21 03:58:09.151739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:09.151440 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sql8b" Apr 21 03:58:09.166717 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:09.166611 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sql8b" podStartSLOduration=66.537531801 podStartE2EDuration="1m9.166596017s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:58:06.1159125 +0000 UTC m=+66.682882969" lastFinishedPulling="2026-04-21 03:58:08.744976707 +0000 UTC m=+69.311947185" observedRunningTime="2026-04-21 03:58:09.165782437 +0000 UTC m=+69.732752941" watchObservedRunningTime="2026-04-21 03:58:09.166596017 +0000 UTC m=+69.733566510" Apr 21 03:58:12.149815 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:12.149761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" Apr 21 03:58:12.150184 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:12.149847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" Apr 21 03:58:12.150184 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:12.149912 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:58:12.150184 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:12.149966 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 03:58:12.150184 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:12.149977 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert podName:2846597e-4516-4d1d-9e48-8d7c984b548c nodeName:}" failed. No retries permitted until 2026-04-21 03:58:28.149960887 +0000 UTC m=+88.716931362 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dg8bb" (UID: "2846597e-4516-4d1d-9e48-8d7c984b548c") : secret "networking-console-plugin-cert" not found Apr 21 03:58:12.150184 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:12.150008 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls podName:53593715-fad9-4f5d-8bfa-5579ca4bfd14 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:28.149996701 +0000 UTC m=+88.716967171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rf7f7" (UID: "53593715-fad9-4f5d-8bfa-5579ca4bfd14") : secret "cluster-monitoring-operator-tls" not found Apr 21 03:58:12.250931 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:12.250893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:58:12.253223 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:12.253197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"image-registry-8655966fcf-5f4kf\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") " pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:58:12.340844 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:12.340813 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ntv6f\"" Apr 21 03:58:12.349573 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:12.349554 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:58:12.466694 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:12.466654 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8655966fcf-5f4kf"] Apr 21 03:58:12.470924 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:12.470896 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed93b1d_890e_40f1_86fc_3b6445fb57ec.slice/crio-a1eae0a9895cd1c3a6ff0e8648a1067add4426c6c433ac7426b217e1f66586df WatchSource:0}: Error finding container a1eae0a9895cd1c3a6ff0e8648a1067add4426c6c433ac7426b217e1f66586df: Status 404 returned error can't find the container with id a1eae0a9895cd1c3a6ff0e8648a1067add4426c6c433ac7426b217e1f66586df Apr 21 03:58:13.160957 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:13.160922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" event={"ID":"2ed93b1d-890e-40f1-86fc-3b6445fb57ec","Type":"ContainerStarted","Data":"5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0"} Apr 21 03:58:13.160957 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:13.160959 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" event={"ID":"2ed93b1d-890e-40f1-86fc-3b6445fb57ec","Type":"ContainerStarted","Data":"a1eae0a9895cd1c3a6ff0e8648a1067add4426c6c433ac7426b217e1f66586df"} Apr 21 03:58:13.161402 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:13.161203 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" Apr 21 03:58:13.181237 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:13.181188 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" podStartSLOduration=17.1811743 podStartE2EDuration="17.1811743s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:13.18052076 +0000 UTC m=+73.747491253" watchObservedRunningTime="2026-04-21 03:58:13.1811743 +0000 UTC m=+73.748144807" Apr 21 03:58:17.875291 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:17.875263 2578 scope.go:117] "RemoveContainer" containerID="01a1f73b969529cdba0dcbdf35f3bb22fd2db6a80ef23cfa624e07f9670a0225" Apr 21 03:58:18.176086 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:18.176060 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 03:58:18.176439 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:18.176423 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/1.log" Apr 21 03:58:18.176491 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:18.176457 2578 generic.go:358] "Generic (PLEG): container finished" podID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69" containerID="b7401567757016312f98821e3a1a7bb4b5930bb9c14c877138d0a87a0d0e60f7" exitCode=255 Apr 21 03:58:18.176544 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:18.176511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" 
event={"ID":"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69","Type":"ContainerDied","Data":"b7401567757016312f98821e3a1a7bb4b5930bb9c14c877138d0a87a0d0e60f7"} Apr 21 03:58:18.176582 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:18.176543 2578 scope.go:117] "RemoveContainer" containerID="01a1f73b969529cdba0dcbdf35f3bb22fd2db6a80ef23cfa624e07f9670a0225" Apr 21 03:58:18.176903 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:18.176887 2578 scope.go:117] "RemoveContainer" containerID="b7401567757016312f98821e3a1a7bb4b5930bb9c14c877138d0a87a0d0e60f7" Apr 21 03:58:18.177106 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:18.177087 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-5vjmh_openshift-console-operator(d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" podUID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69" Apr 21 03:58:19.182901 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:19.182873 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 03:58:23.612100 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.612070 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4sndf"] Apr 21 03:58:23.616890 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.616866 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.620116 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.620086 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 03:58:23.620116 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.620105 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 03:58:23.620276 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.620142 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qz42s\"" Apr 21 03:58:23.629460 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.629439 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8655966fcf-5f4kf"] Apr 21 03:58:23.634936 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.634914 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4sndf"] Apr 21 03:58:23.680505 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.680477 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67d8c4d679-5fkvs"] Apr 21 03:58:23.683300 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.683282 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.698429 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.698404 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67d8c4d679-5fkvs"] Apr 21 03:58:23.744931 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.744899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-284pz\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-kube-api-access-284pz\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.745102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.744958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82nw8\" (UniqueName: \"kubernetes.io/projected/357eea12-bfee-4163-a496-c41dc3e15906-kube-api-access-82nw8\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.745102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.744999 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac968620-3704-45c1-9775-d96d60659cc1-ca-trust-extracted\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.745102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac968620-3704-45c1-9775-d96d60659cc1-trusted-ca\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: 
\"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.745102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-bound-sa-token\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.745102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/357eea12-bfee-4163-a496-c41dc3e15906-data-volume\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.745455 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/357eea12-bfee-4163-a496-c41dc3e15906-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.745455 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745183 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac968620-3704-45c1-9775-d96d60659cc1-registry-certificates\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.745455 ip-10-0-131-182 kubenswrapper[2578]: I0421 
03:58:23.745217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac968620-3704-45c1-9775-d96d60659cc1-image-registry-private-configuration\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.745455 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745269 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-registry-tls\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.745455 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745298 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/357eea12-bfee-4163-a496-c41dc3e15906-crio-socket\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.745455 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/357eea12-bfee-4163-a496-c41dc3e15906-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.745455 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.745433 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/ac968620-3704-45c1-9775-d96d60659cc1-installation-pull-secrets\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.846526 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac968620-3704-45c1-9775-d96d60659cc1-image-registry-private-configuration\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.846735 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-registry-tls\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.846735 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/357eea12-bfee-4163-a496-c41dc3e15906-crio-socket\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.846735 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/357eea12-bfee-4163-a496-c41dc3e15906-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 
03:58:23.846735 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846666 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac968620-3704-45c1-9775-d96d60659cc1-installation-pull-secrets\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.846735 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-284pz\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-kube-api-access-284pz\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.846995 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/357eea12-bfee-4163-a496-c41dc3e15906-crio-socket\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.846995 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82nw8\" (UniqueName: \"kubernetes.io/projected/357eea12-bfee-4163-a496-c41dc3e15906-kube-api-access-82nw8\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.846995 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/ac968620-3704-45c1-9775-d96d60659cc1-ca-trust-extracted\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.846995 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac968620-3704-45c1-9775-d96d60659cc1-trusted-ca\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.846995 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-bound-sa-token\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.847248 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.846996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/357eea12-bfee-4163-a496-c41dc3e15906-data-volume\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.847248 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.847068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/357eea12-bfee-4163-a496-c41dc3e15906-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.847248 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:58:23.847097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac968620-3704-45c1-9775-d96d60659cc1-registry-certificates\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.847248 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.847238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/357eea12-bfee-4163-a496-c41dc3e15906-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.847465 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.847280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac968620-3704-45c1-9775-d96d60659cc1-ca-trust-extracted\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.847539 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.847519 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/357eea12-bfee-4163-a496-c41dc3e15906-data-volume\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.848394 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.848364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac968620-3704-45c1-9775-d96d60659cc1-registry-certificates\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: 
\"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.848520 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.848473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac968620-3704-45c1-9775-d96d60659cc1-trusted-ca\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.849193 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.849172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-registry-tls\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.849193 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.849188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac968620-3704-45c1-9775-d96d60659cc1-image-registry-private-configuration\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.849421 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.849406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/357eea12-bfee-4163-a496-c41dc3e15906-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.849465 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.849407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac968620-3704-45c1-9775-d96d60659cc1-installation-pull-secrets\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.856987 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.856965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-284pz\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-kube-api-access-284pz\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.858769 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.858748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82nw8\" (UniqueName: \"kubernetes.io/projected/357eea12-bfee-4163-a496-c41dc3e15906-kube-api-access-82nw8\") pod \"insights-runtime-extractor-4sndf\" (UID: \"357eea12-bfee-4163-a496-c41dc3e15906\") " pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.860282 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.860263 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac968620-3704-45c1-9775-d96d60659cc1-bound-sa-token\") pod \"image-registry-67d8c4d679-5fkvs\" (UID: \"ac968620-3704-45c1-9775-d96d60659cc1\") " pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:23.925949 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.925922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4sndf" Apr 21 03:58:23.992552 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:23.992524 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" Apr 21 03:58:24.053941 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:24.053910 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4sndf"] Apr 21 03:58:24.057151 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:24.057124 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357eea12_bfee_4163_a496_c41dc3e15906.slice/crio-87c94b613ab54bf2f3c9fa412f9cb94115554f2ca3a9720d7e102557433da527 WatchSource:0}: Error finding container 87c94b613ab54bf2f3c9fa412f9cb94115554f2ca3a9720d7e102557433da527: Status 404 returned error can't find the container with id 87c94b613ab54bf2f3c9fa412f9cb94115554f2ca3a9720d7e102557433da527 Apr 21 03:58:24.119993 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:24.119960 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67d8c4d679-5fkvs"] Apr 21 03:58:24.123614 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:24.123589 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac968620_3704_45c1_9775_d96d60659cc1.slice/crio-942f3da9ee7ca32dcf1f46db2b3d3b38980560c3746785d1c935a66cc88172c9 WatchSource:0}: Error finding container 942f3da9ee7ca32dcf1f46db2b3d3b38980560c3746785d1c935a66cc88172c9: Status 404 returned error can't find the container with id 942f3da9ee7ca32dcf1f46db2b3d3b38980560c3746785d1c935a66cc88172c9 Apr 21 03:58:24.196594 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:24.196560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4sndf" event={"ID":"357eea12-bfee-4163-a496-c41dc3e15906","Type":"ContainerStarted","Data":"190b3404c910c8d4df6b6142a1432540a6689b44bb2296cfb79455f9dc926158"} Apr 21 03:58:24.196722 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:58:24.196599 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4sndf" event={"ID":"357eea12-bfee-4163-a496-c41dc3e15906","Type":"ContainerStarted","Data":"87c94b613ab54bf2f3c9fa412f9cb94115554f2ca3a9720d7e102557433da527"}
Apr 21 03:58:24.197823 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:24.197795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" event={"ID":"ac968620-3704-45c1-9775-d96d60659cc1","Type":"ContainerStarted","Data":"cd15887e7026dc47c471aab3bf3fcfd76876f6bb6665f552f7bf77deb72e3e2a"}
Apr 21 03:58:24.197937 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:24.197830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" event={"ID":"ac968620-3704-45c1-9775-d96d60659cc1","Type":"ContainerStarted","Data":"942f3da9ee7ca32dcf1f46db2b3d3b38980560c3746785d1c935a66cc88172c9"}
Apr 21 03:58:24.197937 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:24.197925 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs"
Apr 21 03:58:24.219388 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:24.219332 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs" podStartSLOduration=1.219301439 podStartE2EDuration="1.219301439s" podCreationTimestamp="2026-04-21 03:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:24.219126487 +0000 UTC m=+84.786097017" watchObservedRunningTime="2026-04-21 03:58:24.219301439 +0000 UTC m=+84.786271921"
Apr 21 03:58:25.202370 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:25.202332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4sndf" event={"ID":"357eea12-bfee-4163-a496-c41dc3e15906","Type":"ContainerStarted","Data":"25c686ef517990e2192f9a55b0ea83b32950f6cfd56d840dffde386c4a9f6f27"}
Apr 21 03:58:26.205995 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:26.205964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4sndf" event={"ID":"357eea12-bfee-4163-a496-c41dc3e15906","Type":"ContainerStarted","Data":"3ab04ad4cca50886622d895758516db46725f2f22b27bb9c613d4e8e946ece0f"}
Apr 21 03:58:26.222387 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:26.222347 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4sndf" podStartSLOduration=1.50007898 podStartE2EDuration="3.222334684s" podCreationTimestamp="2026-04-21 03:58:23 +0000 UTC" firstStartedPulling="2026-04-21 03:58:24.139090073 +0000 UTC m=+84.706060544" lastFinishedPulling="2026-04-21 03:58:25.861345775 +0000 UTC m=+86.428316248" observedRunningTime="2026-04-21 03:58:26.220649902 +0000 UTC m=+86.787620394" watchObservedRunningTime="2026-04-21 03:58:26.222334684 +0000 UTC m=+86.789305219"
Apr 21 03:58:26.668755 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:26.668725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:58:26.668932 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:26.668762 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh"
Apr 21 03:58:26.669161 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:26.669144 2578 scope.go:117] "RemoveContainer" containerID="b7401567757016312f98821e3a1a7bb4b5930bb9c14c877138d0a87a0d0e60f7"
Apr 21 03:58:26.669384 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:26.669366 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-5vjmh_openshift-console-operator(d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" podUID="d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69"
Apr 21 03:58:28.186265 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.186205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:58:28.186645 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.186325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"
Apr 21 03:58:28.188748 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.188718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/53593715-fad9-4f5d-8bfa-5579ca4bfd14-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rf7f7\" (UID: \"53593715-fad9-4f5d-8bfa-5579ca4bfd14\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:58:28.188865 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.188749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2846597e-4516-4d1d-9e48-8d7c984b548c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dg8bb\" (UID: \"2846597e-4516-4d1d-9e48-8d7c984b548c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"
Apr 21 03:58:28.217106 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.217081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-nws68\""
Apr 21 03:58:28.225874 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.225856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"
Apr 21 03:58:28.339939 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.339903 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb"]
Apr 21 03:58:28.342691 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:28.342663 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2846597e_4516_4d1d_9e48_8d7c984b548c.slice/crio-d30594048dd4de9eb41d9117245470901315ac13f2938ead5d1ee07c8174e7e7 WatchSource:0}: Error finding container d30594048dd4de9eb41d9117245470901315ac13f2938ead5d1ee07c8174e7e7: Status 404 returned error can't find the container with id d30594048dd4de9eb41d9117245470901315ac13f2938ead5d1ee07c8174e7e7
Apr 21 03:58:28.474658 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.474555 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9dm5k\""
Apr 21 03:58:28.483562 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.483545 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"
Apr 21 03:58:28.611159 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:28.611129 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7"]
Apr 21 03:58:28.614263 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:28.614236 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53593715_fad9_4f5d_8bfa_5579ca4bfd14.slice/crio-729a5370e0e9c9f2cda81ebe4be0e7f46212e1d52917d2bf377fda3410820e5c WatchSource:0}: Error finding container 729a5370e0e9c9f2cda81ebe4be0e7f46212e1d52917d2bf377fda3410820e5c: Status 404 returned error can't find the container with id 729a5370e0e9c9f2cda81ebe4be0e7f46212e1d52917d2bf377fda3410820e5c
Apr 21 03:58:29.215520 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:29.215480 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" event={"ID":"53593715-fad9-4f5d-8bfa-5579ca4bfd14","Type":"ContainerStarted","Data":"729a5370e0e9c9f2cda81ebe4be0e7f46212e1d52917d2bf377fda3410820e5c"}
Apr 21 03:58:29.217136 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:29.217100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" event={"ID":"2846597e-4516-4d1d-9e48-8d7c984b548c","Type":"ContainerStarted","Data":"d30594048dd4de9eb41d9117245470901315ac13f2938ead5d1ee07c8174e7e7"}
Apr 21 03:58:30.220850 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.220808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" event={"ID":"2846597e-4516-4d1d-9e48-8d7c984b548c","Type":"ContainerStarted","Data":"8cdfd2f1f26bd48d3a6d3f869a2d08c74bfe3792560bc111e71a582cc9d17b29"}
Apr 21 03:58:30.236852 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.236802 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dg8bb" podStartSLOduration=33.254730378 podStartE2EDuration="34.236785674s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:58:28.344651604 +0000 UTC m=+88.911622074" lastFinishedPulling="2026-04-21 03:58:29.326706886 +0000 UTC m=+89.893677370" observedRunningTime="2026-04-21 03:58:30.236419278 +0000 UTC m=+90.803389771" watchObservedRunningTime="2026-04-21 03:58:30.236785674 +0000 UTC m=+90.803756167"
Apr 21 03:58:30.719123 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.719087 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"]
Apr 21 03:58:30.722225 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.722209 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:30.724355 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.724338 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-df6p6\""
Apr 21 03:58:30.724450 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.724350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 21 03:58:30.732128 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.732108 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"]
Apr 21 03:58:30.809932 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.809890 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zfnns\" (UID: \"64ed9a8a-2860-46ac-998c-2512e17b4ce8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:30.910917 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:30.910878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zfnns\" (UID: \"64ed9a8a-2860-46ac-998c-2512e17b4ce8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:30.911093 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:30.911036 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 03:58:30.911133 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:30.911104 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates podName:64ed9a8a-2860-46ac-998c-2512e17b4ce8 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:31.411087367 +0000 UTC m=+91.978057857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-zfnns" (UID: "64ed9a8a-2860-46ac-998c-2512e17b4ce8") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 03:58:31.224725 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:31.224687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" event={"ID":"53593715-fad9-4f5d-8bfa-5579ca4bfd14","Type":"ContainerStarted","Data":"6bb3205091bfa723a90ab9722a792ea734c77c0fea9c5b15c4bb4539e4ed1b01"}
Apr 21 03:58:31.238648 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:31.238605 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rf7f7" podStartSLOduration=33.655350183 podStartE2EDuration="35.238587119s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:58:28.616058475 +0000 UTC m=+89.183028946" lastFinishedPulling="2026-04-21 03:58:30.199295413 +0000 UTC m=+90.766265882" observedRunningTime="2026-04-21 03:58:31.238203494 +0000 UTC m=+91.805173985" watchObservedRunningTime="2026-04-21 03:58:31.238587119 +0000 UTC m=+91.805557612"
Apr 21 03:58:31.414096 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:31.414049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zfnns\" (UID: \"64ed9a8a-2860-46ac-998c-2512e17b4ce8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:31.414287 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:31.414208 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 03:58:31.414357 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:31.414288 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates podName:64ed9a8a-2860-46ac-998c-2512e17b4ce8 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:32.414271332 +0000 UTC m=+92.981241806 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-zfnns" (UID: "64ed9a8a-2860-46ac-998c-2512e17b4ce8") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 03:58:32.422542 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:32.422511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zfnns\" (UID: \"64ed9a8a-2860-46ac-998c-2512e17b4ce8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:32.424988 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:32.424961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/64ed9a8a-2860-46ac-998c-2512e17b4ce8-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zfnns\" (UID: \"64ed9a8a-2860-46ac-998c-2512e17b4ce8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:32.531160 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:32.531132 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:32.647685 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:32.647654 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"]
Apr 21 03:58:32.650939 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:32.650913 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ed9a8a_2860_46ac_998c_2512e17b4ce8.slice/crio-feb19a12d8826e249c13c00a76dcba426bd441bbd5268da42cb6ac697a4b11ca WatchSource:0}: Error finding container feb19a12d8826e249c13c00a76dcba426bd441bbd5268da42cb6ac697a4b11ca: Status 404 returned error can't find the container with id feb19a12d8826e249c13c00a76dcba426bd441bbd5268da42cb6ac697a4b11ca
Apr 21 03:58:33.229963 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:33.229932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns" event={"ID":"64ed9a8a-2860-46ac-998c-2512e17b4ce8","Type":"ContainerStarted","Data":"feb19a12d8826e249c13c00a76dcba426bd441bbd5268da42cb6ac697a4b11ca"}
Apr 21 03:58:33.635517 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:33.635439 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:58:34.237772 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.237733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns" event={"ID":"64ed9a8a-2860-46ac-998c-2512e17b4ce8","Type":"ContainerStarted","Data":"2ed71d624761043b25dbbcf1ea4caa123156674c947a19435a9e66470c0c059d"}
Apr 21 03:58:34.237961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.237942 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:34.242516 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.242497 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns"
Apr 21 03:58:34.252384 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.252345 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zfnns" podStartSLOduration=2.865443504 podStartE2EDuration="4.252329119s" podCreationTimestamp="2026-04-21 03:58:30 +0000 UTC" firstStartedPulling="2026-04-21 03:58:32.653239029 +0000 UTC m=+93.220209499" lastFinishedPulling="2026-04-21 03:58:34.040124644 +0000 UTC m=+94.607095114" observedRunningTime="2026-04-21 03:58:34.25150838 +0000 UTC m=+94.818478884" watchObservedRunningTime="2026-04-21 03:58:34.252329119 +0000 UTC m=+94.819299611"
Apr 21 03:58:34.781251 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.781219 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mh68z"]
Apr 21 03:58:34.804347 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.804293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mh68z"]
Apr 21 03:58:34.804501 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.804444 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:34.806663 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.806638 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 03:58:34.806663 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.806658 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 21 03:58:34.806855 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.806692 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 21 03:58:34.807385 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.807370 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zrs45\""
Apr 21 03:58:34.942446 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.942403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76cc7\" (UniqueName: \"kubernetes.io/projected/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-kube-api-access-76cc7\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:34.942621 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.942457 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:34.942621 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.942584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:34.942621 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:34.942615 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.043853 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.043779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.043853 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.043813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.044038 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.043856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76cc7\" (UniqueName: \"kubernetes.io/projected/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-kube-api-access-76cc7\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.044038 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.043883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.044038 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:35.044021 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 21 03:58:35.044144 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:35.044094 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-tls podName:74058be4-ad34-4bb3-a9f8-1a70c3b056f7 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:35.544075912 +0000 UTC m=+96.111046388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-mh68z" (UID: "74058be4-ad34-4bb3-a9f8-1a70c3b056f7") : secret "prometheus-operator-tls" not found
Apr 21 03:58:35.044396 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.044377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.046127 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.046108 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.053557 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.053536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76cc7\" (UniqueName: \"kubernetes.io/projected/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-kube-api-access-76cc7\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.548037 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.547992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.550421 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.550401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/74058be4-ad34-4bb3-a9f8-1a70c3b056f7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mh68z\" (UID: \"74058be4-ad34-4bb3-a9f8-1a70c3b056f7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.713620 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.713579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z"
Apr 21 03:58:35.828707 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:35.828631 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mh68z"]
Apr 21 03:58:35.831545 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:35.831515 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74058be4_ad34_4bb3_a9f8_1a70c3b056f7.slice/crio-69d82e1857fd1d41a8860b6dc7576aa44164c8951d0459205e021e736984413f WatchSource:0}: Error finding container 69d82e1857fd1d41a8860b6dc7576aa44164c8951d0459205e021e736984413f: Status 404 returned error can't find the container with id 69d82e1857fd1d41a8860b6dc7576aa44164c8951d0459205e021e736984413f
Apr 21 03:58:36.246298 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.246267 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z" event={"ID":"74058be4-ad34-4bb3-a9f8-1a70c3b056f7","Type":"ContainerStarted","Data":"69d82e1857fd1d41a8860b6dc7576aa44164c8951d0459205e021e736984413f"}
Apr 21 03:58:36.254818 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.254795 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:58:36.254912 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.254833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:58:36.257180 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.257158 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86dacc78-49b6-4d77-b33d-e5f6f827d63e-metrics-tls\") pod \"dns-default-7mb9p\" (UID: \"86dacc78-49b6-4d77-b33d-e5f6f827d63e\") " pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:58:36.257287 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.257239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3286e93-02a3-4094-a61d-5b8ba11a35d6-cert\") pod \"ingress-canary-zbrz9\" (UID: \"c3286e93-02a3-4094-a61d-5b8ba11a35d6\") " pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:58:36.273651 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.273627 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4vtdx\""
Apr 21 03:58:36.282191 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.282176 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zbrz9"
Apr 21 03:58:36.402699 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.402663 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zbrz9"]
Apr 21 03:58:36.406795 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:36.406751 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3286e93_02a3_4094_a61d_5b8ba11a35d6.slice/crio-177a3c40b815c2ec9de078f7507acc4926d94a5926b7b92cd4389ecdad39c4bb WatchSource:0}: Error finding container 177a3c40b815c2ec9de078f7507acc4926d94a5926b7b92cd4389ecdad39c4bb: Status 404 returned error can't find the container with id 177a3c40b815c2ec9de078f7507acc4926d94a5926b7b92cd4389ecdad39c4bb
Apr 21 03:58:36.547900 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.547823 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lrpqk\""
Apr 21 03:58:36.556739 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.556470 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:58:36.682961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:36.682926 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7mb9p"]
Apr 21 03:58:36.687653 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:36.687610 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86dacc78_49b6_4d77_b33d_e5f6f827d63e.slice/crio-76e26e2e21c2cca38658413d49359884fbef03ceac0f91b5862cc27de1e92cee WatchSource:0}: Error finding container 76e26e2e21c2cca38658413d49359884fbef03ceac0f91b5862cc27de1e92cee: Status 404 returned error can't find the container with id 76e26e2e21c2cca38658413d49359884fbef03ceac0f91b5862cc27de1e92cee
Apr 21 03:58:37.249962 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:37.249929 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbrz9" event={"ID":"c3286e93-02a3-4094-a61d-5b8ba11a35d6","Type":"ContainerStarted","Data":"177a3c40b815c2ec9de078f7507acc4926d94a5926b7b92cd4389ecdad39c4bb"}
Apr 21 03:58:37.251325 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:37.251231 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7mb9p" event={"ID":"86dacc78-49b6-4d77-b33d-e5f6f827d63e","Type":"ContainerStarted","Data":"76e26e2e21c2cca38658413d49359884fbef03ceac0f91b5862cc27de1e92cee"}
Apr 21 03:58:38.256320 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:38.256280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z" event={"ID":"74058be4-ad34-4bb3-a9f8-1a70c3b056f7","Type":"ContainerStarted","Data":"a82d3555351969466f413954d0fcd15e7b1e3c69a9aa4f5873c38cdb5fe23938"}
Apr 21 03:58:38.256779 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:38.256337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z" event={"ID":"74058be4-ad34-4bb3-a9f8-1a70c3b056f7","Type":"ContainerStarted","Data":"507e87b5cb4512d89fa0d730b70e419d81354cf1a6b1f044590260555b831ec5"}
Apr 21 03:58:38.271704 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:38.271619 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-mh68z" podStartSLOduration=2.914884763 podStartE2EDuration="4.271601872s" podCreationTimestamp="2026-04-21 03:58:34 +0000 UTC" firstStartedPulling="2026-04-21 03:58:35.833456809 +0000 UTC m=+96.400427280" lastFinishedPulling="2026-04-21 03:58:37.190173917 +0000 UTC m=+97.757144389" observedRunningTime="2026-04-21 03:58:38.270187684 +0000 UTC m=+98.837158182" watchObservedRunningTime="2026-04-21 03:58:38.271601872 +0000 UTC m=+98.838572365"
Apr 21 03:58:39.261928 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:39.261887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7mb9p" event={"ID":"86dacc78-49b6-4d77-b33d-e5f6f827d63e","Type":"ContainerStarted","Data":"ee5f096b9dec422d5e8f329114208889c155c264b6ec9978068ecd330f68c908"}
Apr 21 03:58:39.261928 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:39.261934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7mb9p" event={"ID":"86dacc78-49b6-4d77-b33d-e5f6f827d63e","Type":"ContainerStarted","Data":"b3800e70d017ec0fac3909edc8dd7736e14e6191da3fa8e381426029cab6d305"}
Apr 21 03:58:39.262441 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:39.262008 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:58:39.263338 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:39.263301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zbrz9" event={"ID":"c3286e93-02a3-4094-a61d-5b8ba11a35d6","Type":"ContainerStarted","Data":"e02cd0d6b6dfc24cd0387673f5aa1e629738f2d0ce560b89bf0eb62ca8afd682"}
Apr 21 03:58:39.277584 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:39.277547 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7mb9p" podStartSLOduration=65.579163797 podStartE2EDuration="1m7.277535622s" podCreationTimestamp="2026-04-21 03:57:32 +0000 UTC" firstStartedPulling="2026-04-21 03:58:36.689684244 +0000 UTC m=+97.256654714" lastFinishedPulling="2026-04-21 03:58:38.388056053 +0000 UTC m=+98.955026539" observedRunningTime="2026-04-21 03:58:39.276901333 +0000 UTC m=+99.843871841" watchObservedRunningTime="2026-04-21 03:58:39.277535622 +0000 UTC m=+99.844506145"
Apr 21 03:58:39.292293 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:39.292242 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zbrz9" podStartSLOduration=65.314850013 podStartE2EDuration="1m7.292225362s" podCreationTimestamp="2026-04-21 03:57:32 +0000 UTC" firstStartedPulling="2026-04-21 03:58:36.409119092 +0000 UTC m=+96.976089577" lastFinishedPulling="2026-04-21 03:58:38.386494441 +0000 UTC m=+98.953464926" observedRunningTime="2026-04-21 03:58:39.290603149 +0000 UTC m=+99.857573641" watchObservedRunningTime="2026-04-21 03:58:39.292225362 +0000 UTC m=+99.859195855"
Apr 21 03:58:40.151065 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.151029 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vbqg8"]
Apr 21 03:58:40.154324 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.154291 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vbqg8"
Apr 21 03:58:40.156021 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.155997 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sql8b"
Apr 21 03:58:40.156563 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.156546 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 03:58:40.156666 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.156565 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 03:58:40.156666 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.156628 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pnzbl\""
Apr 21 03:58:40.156666 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.156653 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 03:58:40.188973 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.188942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-textfile\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8"
Apr 21 03:58:40.189102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.188980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-accelerators-collector-config\") pod \"node-exporter-vbqg8\"
(UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.189102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.189000 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpd98\" (UniqueName: \"kubernetes.io/projected/63d4e126-93be-4cd9-9f82-de3809f011a9-kube-api-access-lpd98\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.189102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.189029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-root\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.189102 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.189083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-tls\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.189345 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.189155 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63d4e126-93be-4cd9-9f82-de3809f011a9-metrics-client-ca\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.189345 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.189221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-wtmp\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.189345 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.189250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.189345 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.189297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-sys\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.290654 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-tls\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.290654 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63d4e126-93be-4cd9-9f82-de3809f011a9-metrics-client-ca\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290690 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-wtmp\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-sys\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-textfile\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-sys\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290843 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-accelerators-collector-config\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpd98\" (UniqueName: \"kubernetes.io/projected/63d4e126-93be-4cd9-9f82-de3809f011a9-kube-api-access-lpd98\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-root\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291130 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.290968 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-wtmp\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291493 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.291354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/63d4e126-93be-4cd9-9f82-de3809f011a9-root\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291754 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:58:40.291728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-textfile\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.291992 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.291933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63d4e126-93be-4cd9-9f82-de3809f011a9-metrics-client-ca\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.296170 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.292354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-accelerators-collector-config\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.296499 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.296476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-tls\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.296600 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.296540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63d4e126-93be-4cd9-9f82-de3809f011a9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vbqg8\" (UID: 
\"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.298775 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.298754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpd98\" (UniqueName: \"kubernetes.io/projected/63d4e126-93be-4cd9-9f82-de3809f011a9-kube-api-access-lpd98\") pod \"node-exporter-vbqg8\" (UID: \"63d4e126-93be-4cd9-9f82-de3809f011a9\") " pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.463759 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:40.463726 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vbqg8" Apr 21 03:58:40.472324 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:40.472269 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d4e126_93be_4cd9_9f82_de3809f011a9.slice/crio-c386f25cbeb03672d17ed79dd4d0ae6e6a782c66454720a62aabe855f2aeaac5 WatchSource:0}: Error finding container c386f25cbeb03672d17ed79dd4d0ae6e6a782c66454720a62aabe855f2aeaac5: Status 404 returned error can't find the container with id c386f25cbeb03672d17ed79dd4d0ae6e6a782c66454720a62aabe855f2aeaac5 Apr 21 03:58:41.269727 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:41.269685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vbqg8" event={"ID":"63d4e126-93be-4cd9-9f82-de3809f011a9","Type":"ContainerStarted","Data":"c386f25cbeb03672d17ed79dd4d0ae6e6a782c66454720a62aabe855f2aeaac5"} Apr 21 03:58:41.873468 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:41.873392 2578 scope.go:117] "RemoveContainer" containerID="b7401567757016312f98821e3a1a7bb4b5930bb9c14c877138d0a87a0d0e60f7" Apr 21 03:58:42.273818 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:42.273783 2578 generic.go:358] "Generic (PLEG): container finished" podID="63d4e126-93be-4cd9-9f82-de3809f011a9" 
containerID="18e31291c66f50be1d07a8f8f8e1258a6cfaa64ac9e337e525ebbc3378e0d2ae" exitCode=0 Apr 21 03:58:42.273974 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:42.273883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vbqg8" event={"ID":"63d4e126-93be-4cd9-9f82-de3809f011a9","Type":"ContainerDied","Data":"18e31291c66f50be1d07a8f8f8e1258a6cfaa64ac9e337e525ebbc3378e0d2ae"} Apr 21 03:58:42.275681 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:42.275658 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 03:58:42.275797 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:42.275782 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" event={"ID":"d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69","Type":"ContainerStarted","Data":"513f7b82228f19a85df4aa341a9b2938eb99e0216b7eb89ab2e5e949d45c197d"} Apr 21 03:58:42.276055 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:42.276042 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:58:42.318276 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:42.318220 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" podStartSLOduration=43.271368948 podStartE2EDuration="46.31820205s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:57:56.8173508 +0000 UTC m=+57.384321270" lastFinishedPulling="2026-04-21 03:57:59.864183896 +0000 UTC m=+60.431154372" observedRunningTime="2026-04-21 03:58:42.316477512 +0000 UTC m=+102.883448019" watchObservedRunningTime="2026-04-21 03:58:42.31820205 +0000 UTC m=+102.885172544" Apr 21 03:58:42.505192 ip-10-0-131-182 kubenswrapper[2578]: I0421 
03:58:42.505158 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-5vjmh" Apr 21 03:58:43.282295 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.282260 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vbqg8" event={"ID":"63d4e126-93be-4cd9-9f82-de3809f011a9","Type":"ContainerStarted","Data":"82fb08518c0da14f691790298c2b6300cbafca7d4ed596e8e84bbf2a56477df9"} Apr 21 03:58:43.282732 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.282302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vbqg8" event={"ID":"63d4e126-93be-4cd9-9f82-de3809f011a9","Type":"ContainerStarted","Data":"7059a59001bb6441140db49231626851418cb645269cee53fdec49af6b924329"} Apr 21 03:58:43.303645 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.303592 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vbqg8" podStartSLOduration=2.422966474 podStartE2EDuration="3.30357668s" podCreationTimestamp="2026-04-21 03:58:40 +0000 UTC" firstStartedPulling="2026-04-21 03:58:40.474056803 +0000 UTC m=+101.041027279" lastFinishedPulling="2026-04-21 03:58:41.354667009 +0000 UTC m=+101.921637485" observedRunningTime="2026-04-21 03:58:43.302276158 +0000 UTC m=+103.869246649" watchObservedRunningTime="2026-04-21 03:58:43.30357668 +0000 UTC m=+103.870547171" Apr 21 03:58:43.342828 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.342785 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-74cb4966bb-7r772"] Apr 21 03:58:43.346757 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.346733 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.349088 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.349059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2a1bqpbdp2ebv\"" Apr 21 03:58:43.349205 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.349097 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 03:58:43.349205 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.349186 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 03:58:43.349410 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.349394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 03:58:43.349485 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.349413 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 03:58:43.349555 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.349528 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-5n2jg\"" Apr 21 03:58:43.349642 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.349623 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 03:58:43.362103 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.360829 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-74cb4966bb-7r772"] Apr 21 03:58:43.419977 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.419935 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-grpc-tls\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.419977 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.419977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2sj\" (UniqueName: \"kubernetes.io/projected/43d0e693-7ea1-400b-a779-cb496fc9bf3f-kube-api-access-np2sj\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.420189 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.420066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.420189 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.420114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-tls\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.420189 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.420170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.420331 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.420193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.420331 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.420216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.420331 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.420238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43d0e693-7ea1-400b-a779-cb496fc9bf3f-metrics-client-ca\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521287 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-grpc-tls\") pod 
\"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521287 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np2sj\" (UniqueName: \"kubernetes.io/projected/43d0e693-7ea1-400b-a779-cb496fc9bf3f-kube-api-access-np2sj\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521540 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521540 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-tls\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521540 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521410 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521540 
ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521540 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.521540 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.521502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43d0e693-7ea1-400b-a779-cb496fc9bf3f-metrics-client-ca\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.522445 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.522408 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43d0e693-7ea1-400b-a779-cb496fc9bf3f-metrics-client-ca\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:43.524295 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.524269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.524711 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.524684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.524850 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.524829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.524911 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.524836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.524911 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.524897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-grpc-tls\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.525012 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.524969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/43d0e693-7ea1-400b-a779-cb496fc9bf3f-secret-thanos-querier-tls\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.527850 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.527831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2sj\" (UniqueName: \"kubernetes.io/projected/43d0e693-7ea1-400b-a779-cb496fc9bf3f-kube-api-access-np2sj\") pod \"thanos-querier-74cb4966bb-7r772\" (UID: \"43d0e693-7ea1-400b-a779-cb496fc9bf3f\") " pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.656129 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.656043 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:43.785806 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:43.785770 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-74cb4966bb-7r772"]
Apr 21 03:58:43.789270 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:43.789247 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43d0e693_7ea1_400b_a779_cb496fc9bf3f.slice/crio-10b2659b26144be2462bc1bfaaf557e298d6030052a472c6edc42f17f9e8e263 WatchSource:0}: Error finding container 10b2659b26144be2462bc1bfaaf557e298d6030052a472c6edc42f17f9e8e263: Status 404 returned error can't find the container with id 10b2659b26144be2462bc1bfaaf557e298d6030052a472c6edc42f17f9e8e263
Apr 21 03:58:44.286658 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:44.286599 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" event={"ID":"43d0e693-7ea1-400b-a779-cb496fc9bf3f","Type":"ContainerStarted","Data":"10b2659b26144be2462bc1bfaaf557e298d6030052a472c6edc42f17f9e8e263"}
Apr 21 03:58:45.207961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.207928 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67d8c4d679-5fkvs"
Apr 21 03:58:45.381274 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.381243 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"]
Apr 21 03:58:45.385804 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.385781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.388046 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.388025 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 21 03:58:45.388198 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.388174 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 21 03:58:45.388295 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.388182 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 21 03:58:45.388295 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.388217 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-wfbnh\""
Apr 21 03:58:45.388295 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.388241 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 21 03:58:45.388899 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.388618 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 21 03:58:45.392866 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.392842 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 21 03:58:45.400460 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.400440 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"]
Apr 21 03:58:45.439485 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-serving-certs-ca-bundle\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.439584 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439523 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8tr\" (UniqueName: \"kubernetes.io/projected/7d9538ae-470d-46c0-b07e-62b8c7686666-kube-api-access-6b8tr\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.439638 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-metrics-client-ca\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.439699 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.439763 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.439824 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439807 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-secret-telemeter-client\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.439892 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-federate-client-tls\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.439961 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.439914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-telemeter-client-tls\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8tr\" (UniqueName: \"kubernetes.io/projected/7d9538ae-470d-46c0-b07e-62b8c7686666-kube-api-access-6b8tr\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-metrics-client-ca\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-secret-telemeter-client\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-federate-client-tls\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-telemeter-client-tls\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.541867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-serving-certs-ca-bundle\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.543018 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.542641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-serving-certs-ca-bundle\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.546574 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.546227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-secret-telemeter-client\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.546746 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.546718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-metrics-client-ca\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.547285 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.547260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d9538ae-470d-46c0-b07e-62b8c7686666-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.548794 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.548748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-telemeter-client-tls\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.549178 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.549156 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-federate-client-tls\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.549570 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.549546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d9538ae-470d-46c0-b07e-62b8c7686666-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.556673 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.556648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8tr\" (UniqueName: \"kubernetes.io/projected/7d9538ae-470d-46c0-b07e-62b8c7686666-kube-api-access-6b8tr\") pod \"telemeter-client-6b5ff8988c-fdvmf\" (UID: \"7d9538ae-470d-46c0-b07e-62b8c7686666\") " pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.712060 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.711972 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"
Apr 21 03:58:45.833913 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:45.833884 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf"]
Apr 21 03:58:45.836726 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:45.836686 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d9538ae_470d_46c0_b07e_62b8c7686666.slice/crio-4b3120d38d7f34beb7e771b8949b5a180344b943de6d1876698915d6e520b227 WatchSource:0}: Error finding container 4b3120d38d7f34beb7e771b8949b5a180344b943de6d1876698915d6e520b227: Status 404 returned error can't find the container with id 4b3120d38d7f34beb7e771b8949b5a180344b943de6d1876698915d6e520b227
Apr 21 03:58:46.297049 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.296999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf" event={"ID":"7d9538ae-470d-46c0-b07e-62b8c7686666","Type":"ContainerStarted","Data":"4b3120d38d7f34beb7e771b8949b5a180344b943de6d1876698915d6e520b227"}
Apr 21 03:58:46.304169 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.304123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" event={"ID":"43d0e693-7ea1-400b-a779-cb496fc9bf3f","Type":"ContainerStarted","Data":"cf2ad72dd20d1d90737e9318199405b4e262823facb5da5dce26f9026d99d1f8"}
Apr 21 03:58:46.304169 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.304165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" event={"ID":"43d0e693-7ea1-400b-a779-cb496fc9bf3f","Type":"ContainerStarted","Data":"b11fa96a6bf82344871751d4ad540c8db2a94f58a785ad909819828b567b3124"}
Apr 21 03:58:46.304449 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.304178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" event={"ID":"43d0e693-7ea1-400b-a779-cb496fc9bf3f","Type":"ContainerStarted","Data":"e9384059ce31ac8e21dfeec799fac34c50dbbb58a4b976795491714c2d5e21eb"}
Apr 21 03:58:46.304449 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.304191 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" event={"ID":"43d0e693-7ea1-400b-a779-cb496fc9bf3f","Type":"ContainerStarted","Data":"1eb9b0b2131a6143a47b87019aa337dcd8dd1d715de6ecc0b856bc76a687da3d"}
Apr 21 03:58:46.304449 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.304202 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" event={"ID":"43d0e693-7ea1-400b-a779-cb496fc9bf3f","Type":"ContainerStarted","Data":"278befe9ab71c24a3b7b19f15d6b0a9e7511eb5910b9a02d6475e42b7ebd3aec"}
Apr 21 03:58:46.304449 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.304212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" event={"ID":"43d0e693-7ea1-400b-a779-cb496fc9bf3f","Type":"ContainerStarted","Data":"1f753f5b9a7a230c727ee882a0505e74bcdceb43bafe608a081c442b0a4dc6a0"}
Apr 21 03:58:46.304635 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.304500 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772"
Apr 21 03:58:46.328350 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:46.328108 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" podStartSLOduration=1.018215395 podStartE2EDuration="3.328087642s" podCreationTimestamp="2026-04-21 03:58:43 +0000 UTC" firstStartedPulling="2026-04-21 03:58:43.791529731 +0000 UTC m=+104.358500201" lastFinishedPulling="2026-04-21 03:58:46.101401975 +0000 UTC m=+106.668372448" observedRunningTime="2026-04-21 03:58:46.325351871 +0000 UTC m=+106.892322357" watchObservedRunningTime="2026-04-21 03:58:46.328087642 +0000 UTC m=+106.895058136"
Apr 21 03:58:48.312245 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.312212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf" event={"ID":"7d9538ae-470d-46c0-b07e-62b8c7686666","Type":"ContainerStarted","Data":"6bff614abbc6b6517c74283df7432a1977fc00ce57f31e66877d15c11b7cb92b"}
Apr 21 03:58:48.649352 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.649249 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" podUID="2ed93b1d-890e-40f1-86fc-3b6445fb57ec" containerName="registry" containerID="cri-o://5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0" gracePeriod=30
Apr 21 03:58:48.899542 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.899481 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:58:48.982244 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982211 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q72zx\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-kube-api-access-q72zx\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982420 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982259 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-bound-sa-token\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982420 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982326 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982420 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982356 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-ca-trust-extracted\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982420 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982413 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-certificates\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982585 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982432 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-installation-pull-secrets\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982585 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982470 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-trusted-ca\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982585 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982489 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-image-registry-private-configuration\") pod \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\" (UID: \"2ed93b1d-890e-40f1-86fc-3b6445fb57ec\") "
Apr 21 03:58:48.982937 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982912 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 03:58:48.982937 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.982924 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 03:58:48.985206 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.985093 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 03:58:48.985206 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.985156 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 03:58:48.985206 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.985176 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 03:58:48.985206 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.985193 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-kube-api-access-q72zx" (OuterVolumeSpecName: "kube-api-access-q72zx") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "kube-api-access-q72zx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 03:58:48.985430 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.985224 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 03:58:48.993801 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:48.993775 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2ed93b1d-890e-40f1-86fc-3b6445fb57ec" (UID: "2ed93b1d-890e-40f1-86fc-3b6445fb57ec"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 03:58:49.083864 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083824 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-certificates\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.083864 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083855 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-installation-pull-secrets\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.083864 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083871 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-trusted-ca\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.084106 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083908 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-image-registry-private-configuration\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.084106 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083922 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q72zx\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-kube-api-access-q72zx\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.084106 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083934 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-bound-sa-token\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.084106 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083946 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-registry-tls\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.084106 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.083957 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ed93b1d-890e-40f1-86fc-3b6445fb57ec-ca-trust-extracted\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:58:49.267989 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.267957 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7mb9p"
Apr 21 03:58:49.317047 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.317015 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ed93b1d-890e-40f1-86fc-3b6445fb57ec" containerID="5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0" exitCode=0
Apr 21 03:58:49.317478 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.317099 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf"
Apr 21 03:58:49.317478 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.317103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" event={"ID":"2ed93b1d-890e-40f1-86fc-3b6445fb57ec","Type":"ContainerDied","Data":"5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0"}
Apr 21 03:58:49.317478 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.317144 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8655966fcf-5f4kf" event={"ID":"2ed93b1d-890e-40f1-86fc-3b6445fb57ec","Type":"ContainerDied","Data":"a1eae0a9895cd1c3a6ff0e8648a1067add4426c6c433ac7426b217e1f66586df"}
Apr 21 03:58:49.317478 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.317166 2578 scope.go:117] "RemoveContainer" containerID="5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0"
Apr 21 03:58:49.319444 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.319417 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf" event={"ID":"7d9538ae-470d-46c0-b07e-62b8c7686666","Type":"ContainerStarted","Data":"fb921e41ab87bc46bd810ad371fefe0beeb6c04d1b67e6fc217be4ff84b86678"}
Apr 21 03:58:49.319560 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.319452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf" event={"ID":"7d9538ae-470d-46c0-b07e-62b8c7686666","Type":"ContainerStarted","Data":"26faa39fd52e267af46cab7d046f4a414cec392ec1415658593f88276caa8d1e"}
Apr 21 03:58:49.326104 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.325995 2578 scope.go:117] "RemoveContainer" containerID="5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0"
Apr 21 03:58:49.326422 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:58:49.326371 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0\": container with ID starting with 5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0 not found: ID does not exist" containerID="5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0"
Apr 21 03:58:49.326498 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.326430 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0"} err="failed to get container status \"5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0\": rpc error: code = NotFound desc = could not find container \"5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0\": container with ID starting with 5b4a9097f35d0201f3f997a33315f0d234cf37a50aba00bb5f889d9bfee70cd0 not found: ID does not exist"
Apr 21 03:58:49.349422 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.349377 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6b5ff8988c-fdvmf" podStartSLOduration=1.866801024 podStartE2EDuration="4.349363552s" podCreationTimestamp="2026-04-21 03:58:45 +0000 UTC" firstStartedPulling="2026-04-21 03:58:45.838850851 +0000 UTC m=+106.405821324" lastFinishedPulling="2026-04-21 03:58:48.321413368 +0000 UTC m=+108.888383852" observedRunningTime="2026-04-21 03:58:49.342359645 +0000 UTC m=+109.909330136" watchObservedRunningTime="2026-04-21 03:58:49.349363552 +0000 UTC m=+109.916334043"
Apr 21 03:58:49.357736 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.357708 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8655966fcf-5f4kf"]
Apr 21 03:58:49.369635 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.369614 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8655966fcf-5f4kf"]
Apr 21 03:58:49.876758 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:49.876727 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed93b1d-890e-40f1-86fc-3b6445fb57ec" path="/var/lib/kubelet/pods/2ed93b1d-890e-40f1-86fc-3b6445fb57ec/volumes"
Apr 21 03:58:51.964539 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.964503 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66c66999d4-g5ddn"]
Apr 21 03:58:51.965003 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.964793 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ed93b1d-890e-40f1-86fc-3b6445fb57ec" containerName="registry"
Apr 21 03:58:51.965003 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.964804 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed93b1d-890e-40f1-86fc-3b6445fb57ec" containerName="registry"
Apr 21 03:58:51.965003 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.964860 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ed93b1d-890e-40f1-86fc-3b6445fb57ec" containerName="registry"
Apr 21 03:58:51.969729 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.969706 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66c66999d4-g5ddn"
Apr 21 03:58:51.972375 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972344 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 03:58:51.972504 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972391 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 03:58:51.972504 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972399 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 03:58:51.972504 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972351 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-rf2j8\""
Apr 21 03:58:51.972504 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972419 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 03:58:51.972504 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972344 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 03:58:51.972830 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972771 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 03:58:51.972830 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.972788 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 03:58:51.977986 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:51.977964 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66c66999d4-g5ddn"]
Apr 21 03:58:52.110955 ip-10-0-131-182 kubenswrapper[2578]: I0421
03:58:52.110916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-oauth-config\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.110955 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.110952 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-oauth-serving-cert\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.111162 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.111076 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-service-ca\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.111162 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.111111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-serving-cert\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.111162 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.111139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-console-config\") pod \"console-66c66999d4-g5ddn\" (UID: 
\"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.111257 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.111167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq7qr\" (UniqueName: \"kubernetes.io/projected/6f3297de-583e-499a-b7e6-72b8366ea40c-kube-api-access-dq7qr\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.212279 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.212244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-serving-cert\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.212279 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.212281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-console-config\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.212533 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.212304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq7qr\" (UniqueName: \"kubernetes.io/projected/6f3297de-583e-499a-b7e6-72b8366ea40c-kube-api-access-dq7qr\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.212533 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.212343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-oauth-config\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.212533 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.212360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-oauth-serving-cert\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.212676 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.212621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-service-ca\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.213109 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.213086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-console-config\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.213204 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.213086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-oauth-serving-cert\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.213204 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.213141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-service-ca\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.214773 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.214722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-oauth-config\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.214855 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.214815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-serving-cert\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.219451 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.219430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq7qr\" (UniqueName: \"kubernetes.io/projected/6f3297de-583e-499a-b7e6-72b8366ea40c-kube-api-access-dq7qr\") pod \"console-66c66999d4-g5ddn\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") " pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.279501 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.279465 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:58:52.314925 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.314898 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-74cb4966bb-7r772" Apr 21 03:58:52.401745 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:52.401712 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66c66999d4-g5ddn"] Apr 21 03:58:52.404783 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:52.404755 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3297de_583e_499a_b7e6_72b8366ea40c.slice/crio-89a7598505577994d2ae083e904125b6d1e7db6ab74fbeff08e19ee35effcd09 WatchSource:0}: Error finding container 89a7598505577994d2ae083e904125b6d1e7db6ab74fbeff08e19ee35effcd09: Status 404 returned error can't find the container with id 89a7598505577994d2ae083e904125b6d1e7db6ab74fbeff08e19ee35effcd09 Apr 21 03:58:53.334917 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:53.334857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c66999d4-g5ddn" event={"ID":"6f3297de-583e-499a-b7e6-72b8366ea40c","Type":"ContainerStarted","Data":"89a7598505577994d2ae083e904125b6d1e7db6ab74fbeff08e19ee35effcd09"} Apr 21 03:58:55.345488 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:55.345450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c66999d4-g5ddn" event={"ID":"6f3297de-583e-499a-b7e6-72b8366ea40c","Type":"ContainerStarted","Data":"58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9"} Apr 21 03:58:55.360599 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:55.360549 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66c66999d4-g5ddn" podStartSLOduration=1.6261978639999999 podStartE2EDuration="4.360521163s" 
podCreationTimestamp="2026-04-21 03:58:51 +0000 UTC" firstStartedPulling="2026-04-21 03:58:52.406741739 +0000 UTC m=+112.973712208" lastFinishedPulling="2026-04-21 03:58:55.141065037 +0000 UTC m=+115.708035507" observedRunningTime="2026-04-21 03:58:55.358953039 +0000 UTC m=+115.925923530" watchObservedRunningTime="2026-04-21 03:58:55.360521163 +0000 UTC m=+115.927491655" Apr 21 03:58:55.991892 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:55.991860 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-599fcbfcd4-z9bm2"] Apr 21 03:58:55.995237 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:55.995216 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.001971 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.001946 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 03:58:56.005987 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.005870 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-599fcbfcd4-z9bm2"] Apr 21 03:58:56.050940 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.050904 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-oauth-config\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.051090 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.050948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-trusted-ca-bundle\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " 
pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.051090 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.051022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-console-config\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.051176 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.051096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-serving-cert\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.051176 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.051121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-oauth-serving-cert\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.051176 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.051142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brwfh\" (UniqueName: \"kubernetes.io/projected/c55f4115-695c-4762-b815-674c07a058ac-kube-api-access-brwfh\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.051273 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.051181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-service-ca\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.151853 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.151816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-service-ca\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152012 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.151878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-oauth-config\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152012 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.151909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-trusted-ca-bundle\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152012 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.151956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-console-config\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152012 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.152006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-serving-cert\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152175 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.152029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-oauth-serving-cert\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152175 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.152053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brwfh\" (UniqueName: \"kubernetes.io/projected/c55f4115-695c-4762-b815-674c07a058ac-kube-api-access-brwfh\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152659 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.152629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-service-ca\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152803 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.152715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-oauth-serving-cert\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152870 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.152813 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-trusted-ca-bundle\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.152965 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.152944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-console-config\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.154530 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.154507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-oauth-config\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.154627 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.154582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-serving-cert\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.159569 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.159548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brwfh\" (UniqueName: \"kubernetes.io/projected/c55f4115-695c-4762-b815-674c07a058ac-kube-api-access-brwfh\") pod \"console-599fcbfcd4-z9bm2\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.304757 ip-10-0-131-182 
kubenswrapper[2578]: I0421 03:58:56.304662 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:58:56.425664 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:56.425626 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-599fcbfcd4-z9bm2"] Apr 21 03:58:56.428030 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:58:56.428004 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc55f4115_695c_4762_b815_674c07a058ac.slice/crio-1b1b830fc8bb83e4f2905d0d56e04d2fcaf3ffd3c97ee87a13da6a4fb8834411 WatchSource:0}: Error finding container 1b1b830fc8bb83e4f2905d0d56e04d2fcaf3ffd3c97ee87a13da6a4fb8834411: Status 404 returned error can't find the container with id 1b1b830fc8bb83e4f2905d0d56e04d2fcaf3ffd3c97ee87a13da6a4fb8834411 Apr 21 03:58:57.353327 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:57.353280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-599fcbfcd4-z9bm2" event={"ID":"c55f4115-695c-4762-b815-674c07a058ac","Type":"ContainerStarted","Data":"82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089"} Apr 21 03:58:57.353327 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:57.353329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-599fcbfcd4-z9bm2" event={"ID":"c55f4115-695c-4762-b815-674c07a058ac","Type":"ContainerStarted","Data":"1b1b830fc8bb83e4f2905d0d56e04d2fcaf3ffd3c97ee87a13da6a4fb8834411"} Apr 21 03:58:57.369446 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:58:57.369399 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-599fcbfcd4-z9bm2" podStartSLOduration=2.369385201 podStartE2EDuration="2.369385201s" podCreationTimestamp="2026-04-21 03:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:57.368281099 +0000 UTC m=+117.935251587" watchObservedRunningTime="2026-04-21 03:58:57.369385201 +0000 UTC m=+117.936355692" Apr 21 03:59:02.279965 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:02.279929 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:59:02.280428 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:02.280082 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:59:02.284715 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:02.284692 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:59:02.375949 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:02.375921 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66c66999d4-g5ddn" Apr 21 03:59:06.304989 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:06.304952 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:59:06.304989 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:06.304997 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:59:06.309840 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:06.309814 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:59:06.387819 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:06.387789 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 03:59:06.429071 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:06.429040 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-66c66999d4-g5ddn"] Apr 21 03:59:09.780090 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:09.780046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:59:09.782389 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:09.782366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88564462-797f-416f-b56b-0e31e0156815-metrics-certs\") pod \"network-metrics-daemon-wcnkn\" (UID: \"88564462-797f-416f-b56b-0e31e0156815\") " pod="openshift-multus/network-metrics-daemon-wcnkn" Apr 21 03:59:09.894501 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:09.894470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rcf6j\"" Apr 21 03:59:09.903246 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:09.903226 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wcnkn"
Apr 21 03:59:10.020263 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:10.020097 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wcnkn"]
Apr 21 03:59:10.023080 ip-10-0-131-182 kubenswrapper[2578]: W0421 03:59:10.023054 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88564462_797f_416f_b56b_0e31e0156815.slice/crio-88b3641731a5289323526e6a9953f0126d36f8a3cd7d0f07dc956919008a9a6a WatchSource:0}: Error finding container 88b3641731a5289323526e6a9953f0126d36f8a3cd7d0f07dc956919008a9a6a: Status 404 returned error can't find the container with id 88b3641731a5289323526e6a9953f0126d36f8a3cd7d0f07dc956919008a9a6a
Apr 21 03:59:10.397425 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:10.397395 2578 generic.go:358] "Generic (PLEG): container finished" podID="b31507b9-91ed-4a27-ad54-be88b2865602" containerID="225dbaff54241d6c625ab4f75702092d055fdde90cdb0a83635c6054542d8709" exitCode=0
Apr 21 03:59:10.397617 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:10.397461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5w6b" event={"ID":"b31507b9-91ed-4a27-ad54-be88b2865602","Type":"ContainerDied","Data":"225dbaff54241d6c625ab4f75702092d055fdde90cdb0a83635c6054542d8709"}
Apr 21 03:59:10.397776 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:10.397761 2578 scope.go:117] "RemoveContainer" containerID="225dbaff54241d6c625ab4f75702092d055fdde90cdb0a83635c6054542d8709"
Apr 21 03:59:10.398510 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:10.398488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wcnkn" event={"ID":"88564462-797f-416f-b56b-0e31e0156815","Type":"ContainerStarted","Data":"88b3641731a5289323526e6a9953f0126d36f8a3cd7d0f07dc956919008a9a6a"}
Apr 21 03:59:11.403580 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:11.403489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-s5w6b" event={"ID":"b31507b9-91ed-4a27-ad54-be88b2865602","Type":"ContainerStarted","Data":"482d0eb771a57299b2ceef9b5c184bf751e6856b1b13a87ef76eed61fe8eb538"}
Apr 21 03:59:11.405228 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:11.405204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wcnkn" event={"ID":"88564462-797f-416f-b56b-0e31e0156815","Type":"ContainerStarted","Data":"ae7864f41421d846f2e8ff25766712cc761a7d06fb6d6600543c056ce832faca"}
Apr 21 03:59:11.405359 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:11.405234 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wcnkn" event={"ID":"88564462-797f-416f-b56b-0e31e0156815","Type":"ContainerStarted","Data":"d1ed8a0378fe3937a8dbfe6d83ba6571a1250907cd716e1c6a47f3c19e730907"}
Apr 21 03:59:11.431821 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:11.431757 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wcnkn" podStartSLOduration=130.411822723 podStartE2EDuration="2m11.431738227s" podCreationTimestamp="2026-04-21 03:57:00 +0000 UTC" firstStartedPulling="2026-04-21 03:59:10.025414496 +0000 UTC m=+130.592384966" lastFinishedPulling="2026-04-21 03:59:11.045329994 +0000 UTC m=+131.612300470" observedRunningTime="2026-04-21 03:59:11.43013611 +0000 UTC m=+131.997106604" watchObservedRunningTime="2026-04-21 03:59:11.431738227 +0000 UTC m=+131.998708721"
Apr 21 03:59:21.435196 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:21.435159 2578 generic.go:358] "Generic (PLEG): container finished" podID="bdaaadc4-2dd1-4be8-955c-755deb5df200" containerID="74e72e906f5ed1f52c808ffc8aa7c996934191c7234ace875e8b97fe9336308f" exitCode=0
Apr 21 03:59:21.435647 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:21.435228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r" event={"ID":"bdaaadc4-2dd1-4be8-955c-755deb5df200","Type":"ContainerDied","Data":"74e72e906f5ed1f52c808ffc8aa7c996934191c7234ace875e8b97fe9336308f"}
Apr 21 03:59:21.435647 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:21.435595 2578 scope.go:117] "RemoveContainer" containerID="74e72e906f5ed1f52c808ffc8aa7c996934191c7234ace875e8b97fe9336308f"
Apr 21 03:59:22.439402 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:22.439366 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-2nv4r" event={"ID":"bdaaadc4-2dd1-4be8-955c-755deb5df200","Type":"ContainerStarted","Data":"cfecbea8560ebb617ff8538da8f6f70cc964d6b9d847292a7e65653e2148d627"}
Apr 21 03:59:31.448187 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.448146 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66c66999d4-g5ddn" podUID="6f3297de-583e-499a-b7e6-72b8366ea40c" containerName="console" containerID="cri-o://58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9" gracePeriod=15
Apr 21 03:59:31.689771 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.689749 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66c66999d4-g5ddn_6f3297de-583e-499a-b7e6-72b8366ea40c/console/0.log"
Apr 21 03:59:31.689904 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.689810 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66c66999d4-g5ddn"
Apr 21 03:59:31.735230 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735138 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-oauth-serving-cert\") pod \"6f3297de-583e-499a-b7e6-72b8366ea40c\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") "
Apr 21 03:59:31.735407 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735275 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-console-config\") pod \"6f3297de-583e-499a-b7e6-72b8366ea40c\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") "
Apr 21 03:59:31.735407 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735343 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq7qr\" (UniqueName: \"kubernetes.io/projected/6f3297de-583e-499a-b7e6-72b8366ea40c-kube-api-access-dq7qr\") pod \"6f3297de-583e-499a-b7e6-72b8366ea40c\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") "
Apr 21 03:59:31.735407 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735403 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-oauth-config\") pod \"6f3297de-583e-499a-b7e6-72b8366ea40c\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") "
Apr 21 03:59:31.735570 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735433 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-serving-cert\") pod \"6f3297de-583e-499a-b7e6-72b8366ea40c\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") "
Apr 21 03:59:31.735570 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735469 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-service-ca\") pod \"6f3297de-583e-499a-b7e6-72b8366ea40c\" (UID: \"6f3297de-583e-499a-b7e6-72b8366ea40c\") "
Apr 21 03:59:31.735570 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735492 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6f3297de-583e-499a-b7e6-72b8366ea40c" (UID: "6f3297de-583e-499a-b7e6-72b8366ea40c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 03:59:31.735704 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735640 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-console-config" (OuterVolumeSpecName: "console-config") pod "6f3297de-583e-499a-b7e6-72b8366ea40c" (UID: "6f3297de-583e-499a-b7e6-72b8366ea40c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 03:59:31.735756 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735742 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-oauth-serving-cert\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:59:31.735805 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735760 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-console-config\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:59:31.735934 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.735914 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-service-ca" (OuterVolumeSpecName: "service-ca") pod "6f3297de-583e-499a-b7e6-72b8366ea40c" (UID: "6f3297de-583e-499a-b7e6-72b8366ea40c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 03:59:31.737511 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.737479 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6f3297de-583e-499a-b7e6-72b8366ea40c" (UID: "6f3297de-583e-499a-b7e6-72b8366ea40c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 03:59:31.737609 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.737528 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6f3297de-583e-499a-b7e6-72b8366ea40c" (UID: "6f3297de-583e-499a-b7e6-72b8366ea40c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 03:59:31.737609 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.737602 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3297de-583e-499a-b7e6-72b8366ea40c-kube-api-access-dq7qr" (OuterVolumeSpecName: "kube-api-access-dq7qr") pod "6f3297de-583e-499a-b7e6-72b8366ea40c" (UID: "6f3297de-583e-499a-b7e6-72b8366ea40c"). InnerVolumeSpecName "kube-api-access-dq7qr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 03:59:31.836605 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.836572 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dq7qr\" (UniqueName: \"kubernetes.io/projected/6f3297de-583e-499a-b7e6-72b8366ea40c-kube-api-access-dq7qr\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:59:31.836605 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.836600 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-oauth-config\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:59:31.836605 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.836609 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f3297de-583e-499a-b7e6-72b8366ea40c-console-serving-cert\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:59:31.836825 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:31.836618 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f3297de-583e-499a-b7e6-72b8366ea40c-service-ca\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 03:59:32.470033 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.470005 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66c66999d4-g5ddn_6f3297de-583e-499a-b7e6-72b8366ea40c/console/0.log"
Apr 21 03:59:32.470422 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.470047 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f3297de-583e-499a-b7e6-72b8366ea40c" containerID="58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9" exitCode=2
Apr 21 03:59:32.470422 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.470120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c66999d4-g5ddn" event={"ID":"6f3297de-583e-499a-b7e6-72b8366ea40c","Type":"ContainerDied","Data":"58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9"}
Apr 21 03:59:32.470422 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.470140 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66c66999d4-g5ddn"
Apr 21 03:59:32.470422 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.470158 2578 scope.go:117] "RemoveContainer" containerID="58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9"
Apr 21 03:59:32.470422 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.470148 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c66999d4-g5ddn" event={"ID":"6f3297de-583e-499a-b7e6-72b8366ea40c","Type":"ContainerDied","Data":"89a7598505577994d2ae083e904125b6d1e7db6ab74fbeff08e19ee35effcd09"}
Apr 21 03:59:32.478136 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.478116 2578 scope.go:117] "RemoveContainer" containerID="58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9"
Apr 21 03:59:32.478421 ip-10-0-131-182 kubenswrapper[2578]: E0421 03:59:32.478400 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9\": container with ID starting with 58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9 not found: ID does not exist" containerID="58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9"
Apr 21 03:59:32.478483 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.478428 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9"} err="failed to get container status \"58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9\": rpc error: code = NotFound desc = could not find container \"58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9\": container with ID starting with 58de481da654f96e53af71885e2042646341732adb17dafd7771cad840d195f9 not found: ID does not exist"
Apr 21 03:59:32.484746 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.484727 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66c66999d4-g5ddn"]
Apr 21 03:59:32.488638 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:32.488616 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66c66999d4-g5ddn"]
Apr 21 03:59:33.876131 ip-10-0-131-182 kubenswrapper[2578]: I0421 03:59:33.876094 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3297de-583e-499a-b7e6-72b8366ea40c" path="/var/lib/kubelet/pods/6f3297de-583e-499a-b7e6-72b8366ea40c/volumes"
Apr 21 04:00:02.405324 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.405287 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:00:02.405894 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.405648 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f3297de-583e-499a-b7e6-72b8366ea40c" containerName="console"
Apr 21 04:00:02.405894 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.405665 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3297de-583e-499a-b7e6-72b8366ea40c" containerName="console"
Apr 21 04:00:02.405894 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.405722 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f3297de-583e-499a-b7e6-72b8366ea40c" containerName="console"
Apr 21 04:00:02.410491 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.410468 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.412854 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.412816 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 04:00:02.412854 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.412845 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 04:00:02.413045 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.412860 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 04:00:02.413045 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.412921 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 04:00:02.413045 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.412958 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 04:00:02.413222 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.413208 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 04:00:02.413280 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.413228 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qdl9m\""
Apr 21 04:00:02.413280 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.413241 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 04:00:02.413402 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.413376 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 04:00:02.419944 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.419916 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 04:00:02.420070 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.419991 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:00:02.489548 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-web-config\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489548 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489550 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fd041aa4-3684-4ba0-ac16-42f892202ca4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489778 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd041aa4-3684-4ba0-ac16-42f892202ca4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489778 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd041aa4-3684-4ba0-ac16-42f892202ca4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489778 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489778 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489919 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd041aa4-3684-4ba0-ac16-42f892202ca4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489919 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489919 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.489919 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-config-volume\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.490065 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.490065 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489953 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n26qm\" (UniqueName: \"kubernetes.io/projected/fd041aa4-3684-4ba0-ac16-42f892202ca4-kube-api-access-n26qm\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.490065 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.489973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd041aa4-3684-4ba0-ac16-42f892202ca4-config-out\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591176 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591404 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591404 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-config-volume\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591404 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591404 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n26qm\" (UniqueName: \"kubernetes.io/projected/fd041aa4-3684-4ba0-ac16-42f892202ca4-kube-api-access-n26qm\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591404 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd041aa4-3684-4ba0-ac16-42f892202ca4-config-out\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591404 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-web-config\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591726 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fd041aa4-3684-4ba0-ac16-42f892202ca4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591726 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd041aa4-3684-4ba0-ac16-42f892202ca4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591726 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd041aa4-3684-4ba0-ac16-42f892202ca4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591726 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591726 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.591726 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.591624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd041aa4-3684-4ba0-ac16-42f892202ca4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.592465 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.592438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd041aa4-3684-4ba0-ac16-42f892202ca4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.593597 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.593568 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fd041aa4-3684-4ba0-ac16-42f892202ca4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.594510 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.594486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.594752 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.594734 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd041aa4-3684-4ba0-ac16-42f892202ca4-config-out\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.594867 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.594770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd041aa4-3684-4ba0-ac16-42f892202ca4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.594988 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.594786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-config-volume\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.594988 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.594861 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-web-config\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.595141 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.594872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.595178 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.595151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd041aa4-3684-4ba0-ac16-42f892202ca4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.595558 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.595536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.596543 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.596518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.596860 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.596840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd041aa4-3684-4ba0-ac16-42f892202ca4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.598138 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.598114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n26qm\" (UniqueName: \"kubernetes.io/projected/fd041aa4-3684-4ba0-ac16-42f892202ca4-kube-api-access-n26qm\") pod \"alertmanager-main-0\" (UID: \"fd041aa4-3684-4ba0-ac16-42f892202ca4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.720435 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.720395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:02.853620 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:02.853572 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:00:02.856515 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:00:02.856485 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd041aa4_3684_4ba0_ac16_42f892202ca4.slice/crio-b808a273e37c8533f94bc2ea1ce81c3f3e14f7e2ab40a4a3a464eee66e9481be WatchSource:0}: Error finding container b808a273e37c8533f94bc2ea1ce81c3f3e14f7e2ab40a4a3a464eee66e9481be: Status 404 returned error can't find the container with id b808a273e37c8533f94bc2ea1ce81c3f3e14f7e2ab40a4a3a464eee66e9481be
Apr 21 04:00:03.563326 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:03.563283 2578 generic.go:358] "Generic (PLEG): container finished" podID="fd041aa4-3684-4ba0-ac16-42f892202ca4" containerID="6d9797d36ff54c6abf6d08c4f85f44ae535ff0aba41b32d4f092b44b8b4b7841" exitCode=0
Apr 21 04:00:03.563698 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:03.563342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerDied","Data":"6d9797d36ff54c6abf6d08c4f85f44ae535ff0aba41b32d4f092b44b8b4b7841"}
Apr 21 04:00:03.563698 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:03.563387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerStarted","Data":"b808a273e37c8533f94bc2ea1ce81c3f3e14f7e2ab40a4a3a464eee66e9481be"}
Apr 21 04:00:05.572532 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:05.572480 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerStarted","Data":"6cda3018ed2c47cffcace068c6bf2789061920070e8281d6bb17fbb2830495d1"}
Apr 21 04:00:05.572532 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:05.572525 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerStarted","Data":"560de7dafa37cce01782356946e451b08fd33aaecc97404968c07c8663519301"}
Apr 21 04:00:05.572532 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:05.572537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerStarted","Data":"555f41366f99807e367d8c6049f7efa2b17b48ab8a013e046cba6d8256dafc62"}
Apr 21 04:00:05.573095 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:05.572545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerStarted","Data":"0a00f43338b479bfb316b652dce86e4788a8261797e91fe77705f71fed3b48d8"}
Apr 21 04:00:05.573095 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:05.572553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerStarted","Data":"1749c11d0c43dbf78529abfbd9bf21e6220610ccecc29b5aa6163c0db9dc9c38"}
Apr 21 04:00:05.573095 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:05.572561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fd041aa4-3684-4ba0-ac16-42f892202ca4","Type":"ContainerStarted","Data":"772104b93f3c76cad024dd51d8543f6cdd2de042066752b18b8698231c2444c8"}
Apr 21 04:00:05.597208 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:05.597105 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.083128273 podStartE2EDuration="3.59708889s" podCreationTimestamp="2026-04-21 04:00:02 +0000 UTC" firstStartedPulling="2026-04-21 04:00:03.564516175 +0000 UTC m=+184.131486656" lastFinishedPulling="2026-04-21 04:00:05.078476799 +0000 UTC m=+185.645447273" observedRunningTime="2026-04-21 04:00:05.594750201 +0000 UTC m=+186.161720694" watchObservedRunningTime="2026-04-21 04:00:05.59708889 +0000 UTC m=+186.164059382" Apr 21 04:00:35.882864 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.882832 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-686976b7d5-7c8v6"] Apr 21 04:00:35.890262 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.890232 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:35.899033 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.898566 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-686976b7d5-7c8v6"] Apr 21 04:00:35.966855 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.966817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-service-ca\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:35.967041 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.966869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dhff\" (UniqueName: \"kubernetes.io/projected/7741d138-b730-4f49-987e-75d6648f19f6-kube-api-access-4dhff\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:35.967041 ip-10-0-131-182 kubenswrapper[2578]: I0421 
04:00:35.966920 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-oauth-config\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:35.967041 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.967015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-console-config\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:35.967150 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.967066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-trusted-ca-bundle\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:35.967150 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.967114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-serving-cert\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:35.967150 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:35.967130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-oauth-serving-cert\") pod 
\"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.068398 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.068359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-console-config\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.068601 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.068413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-trusted-ca-bundle\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.068601 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.068455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-serving-cert\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.068601 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.068471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-oauth-serving-cert\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.068601 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.068490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-service-ca\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.068601 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.068514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dhff\" (UniqueName: \"kubernetes.io/projected/7741d138-b730-4f49-987e-75d6648f19f6-kube-api-access-4dhff\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.068601 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.068541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-oauth-config\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.069820 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.069784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-oauth-serving-cert\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.069939 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.069896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-console-config\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.069939 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.069903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-service-ca\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.070013 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.069970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-trusted-ca-bundle\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.071674 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.071652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-serving-cert\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.071769 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.071683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-oauth-config\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.076138 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.076110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dhff\" (UniqueName: \"kubernetes.io/projected/7741d138-b730-4f49-987e-75d6648f19f6-kube-api-access-4dhff\") pod \"console-686976b7d5-7c8v6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.202364 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.202301 2578 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:36.326246 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.326209 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-686976b7d5-7c8v6"] Apr 21 04:00:36.329591 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:00:36.329560 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7741d138_b730_4f49_987e_75d6648f19f6.slice/crio-3db1a3c2cef59f403c6adaededc7f8d8a7d4f8c0cffc7066ac56e1f1fad5eae3 WatchSource:0}: Error finding container 3db1a3c2cef59f403c6adaededc7f8d8a7d4f8c0cffc7066ac56e1f1fad5eae3: Status 404 returned error can't find the container with id 3db1a3c2cef59f403c6adaededc7f8d8a7d4f8c0cffc7066ac56e1f1fad5eae3 Apr 21 04:00:36.665774 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.665740 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-686976b7d5-7c8v6" event={"ID":"7741d138-b730-4f49-987e-75d6648f19f6","Type":"ContainerStarted","Data":"1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f"} Apr 21 04:00:36.665946 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.665783 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-686976b7d5-7c8v6" event={"ID":"7741d138-b730-4f49-987e-75d6648f19f6","Type":"ContainerStarted","Data":"3db1a3c2cef59f403c6adaededc7f8d8a7d4f8c0cffc7066ac56e1f1fad5eae3"} Apr 21 04:00:36.681993 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:36.681933 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-686976b7d5-7c8v6" podStartSLOduration=1.6819135790000002 podStartE2EDuration="1.681913579s" podCreationTimestamp="2026-04-21 04:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
04:00:36.680418763 +0000 UTC m=+217.247389259" watchObservedRunningTime="2026-04-21 04:00:36.681913579 +0000 UTC m=+217.248884073" Apr 21 04:00:46.203420 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:46.203380 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:46.203420 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:46.203428 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:46.208208 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:46.208184 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:46.707276 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:46.707244 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:00:46.750230 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:00:46.750197 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-599fcbfcd4-z9bm2"] Apr 21 04:01:11.775689 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:11.775633 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-599fcbfcd4-z9bm2" podUID="c55f4115-695c-4762-b815-674c07a058ac" containerName="console" containerID="cri-o://82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089" gracePeriod=15 Apr 21 04:01:12.011891 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.011869 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-599fcbfcd4-z9bm2_c55f4115-695c-4762-b815-674c07a058ac/console/0.log" Apr 21 04:01:12.012009 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.011929 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 04:01:12.179043 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179010 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brwfh\" (UniqueName: \"kubernetes.io/projected/c55f4115-695c-4762-b815-674c07a058ac-kube-api-access-brwfh\") pod \"c55f4115-695c-4762-b815-674c07a058ac\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " Apr 21 04:01:12.179201 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179057 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-serving-cert\") pod \"c55f4115-695c-4762-b815-674c07a058ac\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " Apr 21 04:01:12.179201 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179080 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-oauth-config\") pod \"c55f4115-695c-4762-b815-674c07a058ac\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " Apr 21 04:01:12.179305 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179206 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-trusted-ca-bundle\") pod \"c55f4115-695c-4762-b815-674c07a058ac\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " Apr 21 04:01:12.179305 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179241 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-service-ca\") pod \"c55f4115-695c-4762-b815-674c07a058ac\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " Apr 21 04:01:12.179305 
ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179282 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-console-config\") pod \"c55f4115-695c-4762-b815-674c07a058ac\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " Apr 21 04:01:12.179493 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179361 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-oauth-serving-cert\") pod \"c55f4115-695c-4762-b815-674c07a058ac\" (UID: \"c55f4115-695c-4762-b815-674c07a058ac\") " Apr 21 04:01:12.179693 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179662 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-service-ca" (OuterVolumeSpecName: "service-ca") pod "c55f4115-695c-4762-b815-674c07a058ac" (UID: "c55f4115-695c-4762-b815-674c07a058ac"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:12.179797 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179670 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c55f4115-695c-4762-b815-674c07a058ac" (UID: "c55f4115-695c-4762-b815-674c07a058ac"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:12.179797 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179700 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-console-config" (OuterVolumeSpecName: "console-config") pod "c55f4115-695c-4762-b815-674c07a058ac" (UID: "c55f4115-695c-4762-b815-674c07a058ac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:12.179958 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.179932 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c55f4115-695c-4762-b815-674c07a058ac" (UID: "c55f4115-695c-4762-b815-674c07a058ac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:12.181282 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.181255 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c55f4115-695c-4762-b815-674c07a058ac" (UID: "c55f4115-695c-4762-b815-674c07a058ac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:12.181406 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.181299 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c55f4115-695c-4762-b815-674c07a058ac" (UID: "c55f4115-695c-4762-b815-674c07a058ac"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:12.181406 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.181360 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55f4115-695c-4762-b815-674c07a058ac-kube-api-access-brwfh" (OuterVolumeSpecName: "kube-api-access-brwfh") pod "c55f4115-695c-4762-b815-674c07a058ac" (UID: "c55f4115-695c-4762-b815-674c07a058ac"). InnerVolumeSpecName "kube-api-access-brwfh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:01:12.280305 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.280272 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-serving-cert\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:01:12.280305 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.280303 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c55f4115-695c-4762-b815-674c07a058ac-console-oauth-config\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:01:12.280537 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.280333 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-trusted-ca-bundle\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:01:12.280537 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.280347 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-service-ca\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:01:12.280537 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.280358 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-console-config\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:01:12.280537 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.280369 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c55f4115-695c-4762-b815-674c07a058ac-oauth-serving-cert\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:01:12.280537 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.280380 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-brwfh\" (UniqueName: \"kubernetes.io/projected/c55f4115-695c-4762-b815-674c07a058ac-kube-api-access-brwfh\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:01:12.778066 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.778039 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-599fcbfcd4-z9bm2_c55f4115-695c-4762-b815-674c07a058ac/console/0.log" Apr 21 04:01:12.778475 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.778078 2578 generic.go:358] "Generic (PLEG): container finished" podID="c55f4115-695c-4762-b815-674c07a058ac" containerID="82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089" exitCode=2 Apr 21 04:01:12.778475 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.778139 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-599fcbfcd4-z9bm2" Apr 21 04:01:12.778475 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.778172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-599fcbfcd4-z9bm2" event={"ID":"c55f4115-695c-4762-b815-674c07a058ac","Type":"ContainerDied","Data":"82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089"} Apr 21 04:01:12.778475 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.778209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-599fcbfcd4-z9bm2" event={"ID":"c55f4115-695c-4762-b815-674c07a058ac","Type":"ContainerDied","Data":"1b1b830fc8bb83e4f2905d0d56e04d2fcaf3ffd3c97ee87a13da6a4fb8834411"} Apr 21 04:01:12.778475 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.778225 2578 scope.go:117] "RemoveContainer" containerID="82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089" Apr 21 04:01:12.786835 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.786818 2578 scope.go:117] "RemoveContainer" containerID="82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089" Apr 21 04:01:12.787052 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:12.787032 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089\": container with ID starting with 82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089 not found: ID does not exist" containerID="82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089" Apr 21 04:01:12.787101 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.787060 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089"} err="failed to get container status \"82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089\": rpc error: code = 
NotFound desc = could not find container \"82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089\": container with ID starting with 82763399d6c310de4681853afba2047390156a727d5f319f2d87204528564089 not found: ID does not exist"
Apr 21 04:01:12.798217 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.798196 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-599fcbfcd4-z9bm2"]
Apr 21 04:01:12.803349 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:12.803328 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-599fcbfcd4-z9bm2"]
Apr 21 04:01:13.876015 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:13.875983 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55f4115-695c-4762-b815-674c07a058ac" path="/var/lib/kubelet/pods/c55f4115-695c-4762-b815-674c07a058ac/volumes"
Apr 21 04:01:17.516283 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.516245 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"]
Apr 21 04:01:17.516671 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.516592 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c55f4115-695c-4762-b815-674c07a058ac" containerName="console"
Apr 21 04:01:17.516671 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.516604 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55f4115-695c-4762-b815-674c07a058ac" containerName="console"
Apr 21 04:01:17.516671 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.516668 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c55f4115-695c-4762-b815-674c07a058ac" containerName="console"
Apr 21 04:01:17.521153 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.521136 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.523216 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.523197 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 04:01:17.523812 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.523796 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 04:01:17.523894 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.523803 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-shzhf\""
Apr 21 04:01:17.525256 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.525235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.525376 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.525292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4xh\" (UniqueName: \"kubernetes.io/projected/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-kube-api-access-cd4xh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.525484 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.525459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.527362 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.527304 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"]
Apr 21 04:01:17.625920 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.625881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4xh\" (UniqueName: \"kubernetes.io/projected/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-kube-api-access-cd4xh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.626107 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.625950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.626107 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.625986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.626384 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.626361 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.626420 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.626374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.633738 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.633716 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4xh\" (UniqueName: \"kubernetes.io/projected/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-kube-api-access-cd4xh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.831083 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.830993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:17.980003 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:17.979975 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"]
Apr 21 04:01:17.982634 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:01:17.982605 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f78f0a_3b91_431d_b4f9_dbb1765b39b9.slice/crio-046f11acc912db181e735861018a81048b028fadf72ab8b82386e1d193ee7342 WatchSource:0}: Error finding container 046f11acc912db181e735861018a81048b028fadf72ab8b82386e1d193ee7342: Status 404 returned error can't find the container with id 046f11acc912db181e735861018a81048b028fadf72ab8b82386e1d193ee7342
Apr 21 04:01:18.797521 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:18.797438 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn" event={"ID":"67f78f0a-3b91-431d-b4f9-dbb1765b39b9","Type":"ContainerStarted","Data":"046f11acc912db181e735861018a81048b028fadf72ab8b82386e1d193ee7342"}
Apr 21 04:01:23.815704 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:23.815612 2578 generic.go:358] "Generic (PLEG): container finished" podID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerID="cb36fdea1ab0d83cc99e48b0e743047be35a34362d7cdc0bda11b8c27137a802" exitCode=0
Apr 21 04:01:23.816049 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:23.815699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn" event={"ID":"67f78f0a-3b91-431d-b4f9-dbb1765b39b9","Type":"ContainerDied","Data":"cb36fdea1ab0d83cc99e48b0e743047be35a34362d7cdc0bda11b8c27137a802"}
Apr 21 04:01:26.827401 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:26.827367 2578 generic.go:358] "Generic (PLEG): container finished" podID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerID="a537f4cc313c4e4842854d050c2d1bcedca53c1dd2847fa9a23c1a12edd4eff0" exitCode=0
Apr 21 04:01:26.827401 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:26.827404 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn" event={"ID":"67f78f0a-3b91-431d-b4f9-dbb1765b39b9","Type":"ContainerDied","Data":"a537f4cc313c4e4842854d050c2d1bcedca53c1dd2847fa9a23c1a12edd4eff0"}
Apr 21 04:01:33.852306 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:33.852267 2578 generic.go:358] "Generic (PLEG): container finished" podID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerID="893bf8168ee5ef9cbeb9508e38edd3db45a9babe364bc2084b4c219b70352b21" exitCode=0
Apr 21 04:01:33.852707 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:33.852349 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn" event={"ID":"67f78f0a-3b91-431d-b4f9-dbb1765b39b9","Type":"ContainerDied","Data":"893bf8168ee5ef9cbeb9508e38edd3db45a9babe364bc2084b4c219b70352b21"}
Apr 21 04:01:34.972408 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:34.972381 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:35.085014 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.084985 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd4xh\" (UniqueName: \"kubernetes.io/projected/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-kube-api-access-cd4xh\") pod \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") "
Apr 21 04:01:35.085177 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.085044 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-util\") pod \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") "
Apr 21 04:01:35.085177 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.085082 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-bundle\") pod \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\" (UID: \"67f78f0a-3b91-431d-b4f9-dbb1765b39b9\") "
Apr 21 04:01:35.085793 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.085759 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-bundle" (OuterVolumeSpecName: "bundle") pod "67f78f0a-3b91-431d-b4f9-dbb1765b39b9" (UID: "67f78f0a-3b91-431d-b4f9-dbb1765b39b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:01:35.087186 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.087162 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-kube-api-access-cd4xh" (OuterVolumeSpecName: "kube-api-access-cd4xh") pod "67f78f0a-3b91-431d-b4f9-dbb1765b39b9" (UID: "67f78f0a-3b91-431d-b4f9-dbb1765b39b9"). InnerVolumeSpecName "kube-api-access-cd4xh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:01:35.089421 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.089403 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-util" (OuterVolumeSpecName: "util") pod "67f78f0a-3b91-431d-b4f9-dbb1765b39b9" (UID: "67f78f0a-3b91-431d-b4f9-dbb1765b39b9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:01:35.186547 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.186517 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-util\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 04:01:35.186683 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.186567 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-bundle\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 04:01:35.186683 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.186577 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cd4xh\" (UniqueName: \"kubernetes.io/projected/67f78f0a-3b91-431d-b4f9-dbb1765b39b9-kube-api-access-cd4xh\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\""
Apr 21 04:01:35.859621 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.859588 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn" event={"ID":"67f78f0a-3b91-431d-b4f9-dbb1765b39b9","Type":"ContainerDied","Data":"046f11acc912db181e735861018a81048b028fadf72ab8b82386e1d193ee7342"}
Apr 21 04:01:35.859621 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.859619 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="046f11acc912db181e735861018a81048b028fadf72ab8b82386e1d193ee7342"
Apr 21 04:01:35.859819 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:35.859641 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c5szbn"
Apr 21 04:01:38.893159 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893123 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"]
Apr 21 04:01:38.893579 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893561 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerName="pull"
Apr 21 04:01:38.893626 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893584 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerName="pull"
Apr 21 04:01:38.893626 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893601 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerName="util"
Apr 21 04:01:38.893626 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893609 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerName="util"
Apr 21 04:01:38.893718 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893636 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerName="extract"
Apr 21 04:01:38.893718 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893645 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerName="extract"
Apr 21 04:01:38.893787 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.893753 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="67f78f0a-3b91-431d-b4f9-dbb1765b39b9" containerName="extract"
Apr 21 04:01:38.955715 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.955681 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"]
Apr 21 04:01:38.955854 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.955799 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:38.958016 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.957980 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 21 04:01:38.958016 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.958002 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 21 04:01:38.958206 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.958044 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 21 04:01:38.958206 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:38.958074 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-lb2dx\""
Apr 21 04:01:39.020199 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.020159 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6bl\" (UniqueName: \"kubernetes.io/projected/db432686-e24e-4a6d-8d8e-ea5fd670a4a4-kube-api-access-lq6bl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4phkb\" (UID: \"db432686-e24e-4a6d-8d8e-ea5fd670a4a4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:39.020382 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.020235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db432686-e24e-4a6d-8d8e-ea5fd670a4a4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4phkb\" (UID: \"db432686-e24e-4a6d-8d8e-ea5fd670a4a4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:39.121167 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.121132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db432686-e24e-4a6d-8d8e-ea5fd670a4a4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4phkb\" (UID: \"db432686-e24e-4a6d-8d8e-ea5fd670a4a4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:39.121381 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.121240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6bl\" (UniqueName: \"kubernetes.io/projected/db432686-e24e-4a6d-8d8e-ea5fd670a4a4-kube-api-access-lq6bl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4phkb\" (UID: \"db432686-e24e-4a6d-8d8e-ea5fd670a4a4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:39.123498 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.123467 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db432686-e24e-4a6d-8d8e-ea5fd670a4a4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4phkb\" (UID: \"db432686-e24e-4a6d-8d8e-ea5fd670a4a4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:39.128639 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.128615 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6bl\" (UniqueName: \"kubernetes.io/projected/db432686-e24e-4a6d-8d8e-ea5fd670a4a4-kube-api-access-lq6bl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4phkb\" (UID: \"db432686-e24e-4a6d-8d8e-ea5fd670a4a4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:39.265974 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.265935 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:39.389855 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.389829 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"]
Apr 21 04:01:39.393055 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:01:39.393025 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb432686_e24e_4a6d_8d8e_ea5fd670a4a4.slice/crio-b178456c8615da2b071fa6b392ac7bbff5a15f3cc0112372b8dbc1c8d122388e WatchSource:0}: Error finding container b178456c8615da2b071fa6b392ac7bbff5a15f3cc0112372b8dbc1c8d122388e: Status 404 returned error can't find the container with id b178456c8615da2b071fa6b392ac7bbff5a15f3cc0112372b8dbc1c8d122388e
Apr 21 04:01:39.880818 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:39.880781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb" event={"ID":"db432686-e24e-4a6d-8d8e-ea5fd670a4a4","Type":"ContainerStarted","Data":"b178456c8615da2b071fa6b392ac7bbff5a15f3cc0112372b8dbc1c8d122388e"}
Apr 21 04:01:43.562458 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.562426 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-477h2"]
Apr 21 04:01:43.588413 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.588379 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-477h2"]
Apr 21 04:01:43.588581 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.588518 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.590860 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.590834 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 21 04:01:43.591009 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.590878 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 21 04:01:43.591009 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.590906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-mjx4l\""
Apr 21 04:01:43.660589 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.660556 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f9nd\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-kube-api-access-5f9nd\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.660744 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.660599 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.660744 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.660670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b5d6aac7-d55e-47b2-9df5-da64de5313c0-cabundle0\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.761281 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.761248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b5d6aac7-d55e-47b2-9df5-da64de5313c0-cabundle0\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.761489 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.761328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f9nd\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-kube-api-access-5f9nd\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.761489 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.761357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.761612 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:43.761513 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 21 04:01:43.761612 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:43.761530 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 21 04:01:43.761612 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:43.761537 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 04:01:43.761612 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:43.761549 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-477h2: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 21 04:01:43.761612 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:43.761599 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates podName:b5d6aac7-d55e-47b2-9df5-da64de5313c0 nodeName:}" failed. No retries permitted until 2026-04-21 04:01:44.261583215 +0000 UTC m=+284.828553707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates") pod "keda-operator-ffbb595cb-477h2" (UID: "b5d6aac7-d55e-47b2-9df5-da64de5313c0") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 21 04:01:43.762009 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.761986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b5d6aac7-d55e-47b2-9df5-da64de5313c0-cabundle0\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.770749 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.770724 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f9nd\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-kube-api-access-5f9nd\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:43.894228 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.894133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb" event={"ID":"db432686-e24e-4a6d-8d8e-ea5fd670a4a4","Type":"ContainerStarted","Data":"93bd830e63854033abd8cbe8dfaa436d1b25c0018de350b5d7b5c160e96cd8aa"}
Apr 21 04:01:43.894228 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.894175 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb"
Apr 21 04:01:43.912941 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:43.912862 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb" podStartSLOduration=2.305047733 podStartE2EDuration="5.912845077s" podCreationTimestamp="2026-04-21 04:01:38 +0000 UTC" firstStartedPulling="2026-04-21 04:01:39.394745845 +0000 UTC m=+279.961716315" lastFinishedPulling="2026-04-21 04:01:43.002543189 +0000 UTC m=+283.569513659" observedRunningTime="2026-04-21 04:01:43.912115138 +0000 UTC m=+284.479085631" watchObservedRunningTime="2026-04-21 04:01:43.912845077 +0000 UTC m=+284.479815570"
Apr 21 04:01:44.062411 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.062375 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-dhd5s"]
Apr 21 04:01:44.084346 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.084303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-dhd5s"]
Apr 21 04:01:44.084507 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.084435 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.087861 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.087839 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 21 04:01:44.169413 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.164908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxn5x\" (UniqueName: \"kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-kube-api-access-xxn5x\") pod \"keda-admission-cf49989db-dhd5s\" (UID: \"b9355817-8910-410f-8b45-5b67102cb5ef\") " pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.169413 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.165043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-certificates\") pod \"keda-admission-cf49989db-dhd5s\" (UID: \"b9355817-8910-410f-8b45-5b67102cb5ef\") " pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.266350 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.266292 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxn5x\" (UniqueName: \"kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-kube-api-access-xxn5x\") pod \"keda-admission-cf49989db-dhd5s\" (UID: \"b9355817-8910-410f-8b45-5b67102cb5ef\") " pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.266544 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.266378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:44.266544 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.266447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-certificates\") pod \"keda-admission-cf49989db-dhd5s\" (UID: \"b9355817-8910-410f-8b45-5b67102cb5ef\") " pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.266658 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:44.266554 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 21 04:01:44.266658 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:44.266578 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 04:01:44.266658 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:44.266594 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-477h2: references non-existent secret key: ca.crt
Apr 21 04:01:44.266801 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:44.266665 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates podName:b5d6aac7-d55e-47b2-9df5-da64de5313c0 nodeName:}" failed. No retries permitted until 2026-04-21 04:01:45.266644111 +0000 UTC m=+285.833614588 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates") pod "keda-operator-ffbb595cb-477h2" (UID: "b5d6aac7-d55e-47b2-9df5-da64de5313c0") : references non-existent secret key: ca.crt
Apr 21 04:01:44.266801 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:44.266592 2578 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 21 04:01:44.266801 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:44.266693 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-dhd5s: secret "keda-admission-webhooks-certs" not found
Apr 21 04:01:44.266801 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:44.266772 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-certificates podName:b9355817-8910-410f-8b45-5b67102cb5ef nodeName:}" failed. No retries permitted until 2026-04-21 04:01:44.766754222 +0000 UTC m=+285.333724698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-certificates") pod "keda-admission-cf49989db-dhd5s" (UID: "b9355817-8910-410f-8b45-5b67102cb5ef") : secret "keda-admission-webhooks-certs" not found
Apr 21 04:01:44.275642 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.275615 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxn5x\" (UniqueName: \"kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-kube-api-access-xxn5x\") pod \"keda-admission-cf49989db-dhd5s\" (UID: \"b9355817-8910-410f-8b45-5b67102cb5ef\") " pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.770933 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.770899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-certificates\") pod \"keda-admission-cf49989db-dhd5s\" (UID: \"b9355817-8910-410f-8b45-5b67102cb5ef\") " pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.773460 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.773435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b9355817-8910-410f-8b45-5b67102cb5ef-certificates\") pod \"keda-admission-cf49989db-dhd5s\" (UID: \"b9355817-8910-410f-8b45-5b67102cb5ef\") " pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:44.995303 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:44.994626 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:45.117290 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:45.117198 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-dhd5s"]
Apr 21 04:01:45.120753 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:01:45.120726 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9355817_8910_410f_8b45_5b67102cb5ef.slice/crio-9cc0f8098056af6cb0fed6cebde80ba235047a30f57b9922cb729e762a7fedef WatchSource:0}: Error finding container 9cc0f8098056af6cb0fed6cebde80ba235047a30f57b9922cb729e762a7fedef: Status 404 returned error can't find the container with id 9cc0f8098056af6cb0fed6cebde80ba235047a30f57b9922cb729e762a7fedef
Apr 21 04:01:45.276426 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:45.276384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2"
Apr 21 04:01:45.276600 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:45.276545 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 21 04:01:45.276600 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:45.276563 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 04:01:45.276600 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:45.276573 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-477h2: references non-existent secret key: ca.crt
Apr 21 04:01:45.276695 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:01:45.276623 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates podName:b5d6aac7-d55e-47b2-9df5-da64de5313c0 nodeName:}" failed. No retries permitted until 2026-04-21 04:01:47.276609417 +0000 UTC m=+287.843579887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates") pod "keda-operator-ffbb595cb-477h2" (UID: "b5d6aac7-d55e-47b2-9df5-da64de5313c0") : references non-existent secret key: ca.crt
Apr 21 04:01:45.903397 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:45.903349 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-dhd5s" event={"ID":"b9355817-8910-410f-8b45-5b67102cb5ef","Type":"ContainerStarted","Data":"9cc0f8098056af6cb0fed6cebde80ba235047a30f57b9922cb729e762a7fedef"}
Apr 21 04:01:46.908207 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:46.908167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-dhd5s" event={"ID":"b9355817-8910-410f-8b45-5b67102cb5ef","Type":"ContainerStarted","Data":"ff038b2bed8fc0a1296554c5de14c572c21caa5eff2fb67c774c555397f9ccb5"}
Apr 21 04:01:46.908614 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:46.908328 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-dhd5s"
Apr 21 04:01:46.924022 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:46.923978 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-dhd5s" podStartSLOduration=1.6427877880000001 podStartE2EDuration="2.923966063s" podCreationTimestamp="2026-04-21 04:01:44 +0000 UTC" firstStartedPulling="2026-04-21 04:01:45.122399371 +0000 UTC m=+285.689369844" lastFinishedPulling="2026-04-21 04:01:46.403577637 +0000 UTC m=+286.970548119" observedRunningTime="2026-04-21 04:01:46.922067388 +0000 UTC
m=+287.489037880" watchObservedRunningTime="2026-04-21 04:01:46.923966063 +0000 UTC m=+287.490936554" Apr 21 04:01:47.293933 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:47.293897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2" Apr 21 04:01:47.296289 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:47.296261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b5d6aac7-d55e-47b2-9df5-da64de5313c0-certificates\") pod \"keda-operator-ffbb595cb-477h2\" (UID: \"b5d6aac7-d55e-47b2-9df5-da64de5313c0\") " pod="openshift-keda/keda-operator-ffbb595cb-477h2" Apr 21 04:01:47.499329 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:47.499272 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-477h2" Apr 21 04:01:47.614844 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:47.614811 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-477h2"] Apr 21 04:01:47.618361 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:01:47.618330 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d6aac7_d55e_47b2_9df5_da64de5313c0.slice/crio-3ebaa87a98b6d371d9265e6d9b8c4b5cca449bf310de17f377bdc2cedf135475 WatchSource:0}: Error finding container 3ebaa87a98b6d371d9265e6d9b8c4b5cca449bf310de17f377bdc2cedf135475: Status 404 returned error can't find the container with id 3ebaa87a98b6d371d9265e6d9b8c4b5cca449bf310de17f377bdc2cedf135475 Apr 21 04:01:47.912427 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:47.912391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-477h2" event={"ID":"b5d6aac7-d55e-47b2-9df5-da64de5313c0","Type":"ContainerStarted","Data":"3ebaa87a98b6d371d9265e6d9b8c4b5cca449bf310de17f377bdc2cedf135475"} Apr 21 04:01:50.923645 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:50.923607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-477h2" event={"ID":"b5d6aac7-d55e-47b2-9df5-da64de5313c0","Type":"ContainerStarted","Data":"9b6dd97048afd7793e3dee95fa1d8558a909c9402f4d48e74e74876b19b46182"} Apr 21 04:01:50.924010 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:50.923851 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-477h2" Apr 21 04:01:50.939425 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:50.939373 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-477h2" podStartSLOduration=4.919877014 podStartE2EDuration="7.93935712s" 
podCreationTimestamp="2026-04-21 04:01:43 +0000 UTC" firstStartedPulling="2026-04-21 04:01:47.619714877 +0000 UTC m=+288.186685347" lastFinishedPulling="2026-04-21 04:01:50.639194983 +0000 UTC m=+291.206165453" observedRunningTime="2026-04-21 04:01:50.938352951 +0000 UTC m=+291.505323444" watchObservedRunningTime="2026-04-21 04:01:50.93935712 +0000 UTC m=+291.506327609" Apr 21 04:01:59.799501 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:59.799475 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:01:59.799970 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:59.799483 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:01:59.803187 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:59.803159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:01:59.803361 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:59.803222 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:01:59.811777 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:01:59.811753 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 04:02:04.900106 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:04.900074 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4phkb" Apr 21 04:02:07.915179 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:07.915147 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-dhd5s" Apr 21 
04:02:11.928643 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:11.928611 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-477h2" Apr 21 04:02:49.430078 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.430002 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-bvvqg"] Apr 21 04:02:49.435480 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.435449 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr"] Apr 21 04:02:49.439171 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.439146 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:49.439293 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.439147 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:49.440435 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.440410 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjfvm\" (UniqueName: \"kubernetes.io/projected/5ae4af40-6f16-417f-acde-99f109430e66-kube-api-access-sjfvm\") pod \"kserve-controller-manager-6f655776dd-bvvqg\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:49.440537 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.440451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert\") pod \"kserve-controller-manager-6f655776dd-bvvqg\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:49.442488 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.442465 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 21 04:02:49.442612 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.442502 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 21 04:02:49.442867 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.442849 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-w762g\"" Apr 21 04:02:49.443176 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.443161 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 21 04:02:49.443466 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.443450 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 21 04:02:49.443710 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.443546 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-lbghf\"" Apr 21 04:02:49.445543 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.445524 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-bvvqg"] Apr 21 04:02:49.447478 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.447442 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr"] Apr 21 04:02:49.541459 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.541416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert\") pod \"kserve-controller-manager-6f655776dd-bvvqg\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:49.541459 ip-10-0-131-182 
kubenswrapper[2578]: I0421 04:02:49.541462 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brzfv\" (UniqueName: \"kubernetes.io/projected/9400702e-ae5e-44f2-8573-b66e14baa2f5-kube-api-access-brzfv\") pod \"llmisvc-controller-manager-68cc5db7c4-v5zdr\" (UID: \"9400702e-ae5e-44f2-8573-b66e14baa2f5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:49.541672 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.541515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9400702e-ae5e-44f2-8573-b66e14baa2f5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-v5zdr\" (UID: \"9400702e-ae5e-44f2-8573-b66e14baa2f5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:49.541672 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.541536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjfvm\" (UniqueName: \"kubernetes.io/projected/5ae4af40-6f16-417f-acde-99f109430e66-kube-api-access-sjfvm\") pod \"kserve-controller-manager-6f655776dd-bvvqg\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:49.541672 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:02:49.541552 2578 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 21 04:02:49.541672 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:02:49.541625 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert podName:5ae4af40-6f16-417f-acde-99f109430e66 nodeName:}" failed. No retries permitted until 2026-04-21 04:02:50.041607311 +0000 UTC m=+350.608577781 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert") pod "kserve-controller-manager-6f655776dd-bvvqg" (UID: "5ae4af40-6f16-417f-acde-99f109430e66") : secret "kserve-webhook-server-cert" not found Apr 21 04:02:49.560274 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.560244 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjfvm\" (UniqueName: \"kubernetes.io/projected/5ae4af40-6f16-417f-acde-99f109430e66-kube-api-access-sjfvm\") pod \"kserve-controller-manager-6f655776dd-bvvqg\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:49.642383 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.642348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brzfv\" (UniqueName: \"kubernetes.io/projected/9400702e-ae5e-44f2-8573-b66e14baa2f5-kube-api-access-brzfv\") pod \"llmisvc-controller-manager-68cc5db7c4-v5zdr\" (UID: \"9400702e-ae5e-44f2-8573-b66e14baa2f5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:49.642549 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.642455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9400702e-ae5e-44f2-8573-b66e14baa2f5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-v5zdr\" (UID: \"9400702e-ae5e-44f2-8573-b66e14baa2f5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:49.642611 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:02:49.642592 2578 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 21 04:02:49.642684 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:02:49.642665 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9400702e-ae5e-44f2-8573-b66e14baa2f5-cert 
podName:9400702e-ae5e-44f2-8573-b66e14baa2f5 nodeName:}" failed. No retries permitted until 2026-04-21 04:02:50.142648636 +0000 UTC m=+350.709619110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9400702e-ae5e-44f2-8573-b66e14baa2f5-cert") pod "llmisvc-controller-manager-68cc5db7c4-v5zdr" (UID: "9400702e-ae5e-44f2-8573-b66e14baa2f5") : secret "llmisvc-webhook-server-cert" not found Apr 21 04:02:49.653270 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:49.653242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brzfv\" (UniqueName: \"kubernetes.io/projected/9400702e-ae5e-44f2-8573-b66e14baa2f5-kube-api-access-brzfv\") pod \"llmisvc-controller-manager-68cc5db7c4-v5zdr\" (UID: \"9400702e-ae5e-44f2-8573-b66e14baa2f5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:50.046350 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.046300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert\") pod \"kserve-controller-manager-6f655776dd-bvvqg\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:50.048701 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.048675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert\") pod \"kserve-controller-manager-6f655776dd-bvvqg\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:50.058671 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.058651 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:50.147094 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.147058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9400702e-ae5e-44f2-8573-b66e14baa2f5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-v5zdr\" (UID: \"9400702e-ae5e-44f2-8573-b66e14baa2f5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:50.149564 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.149533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9400702e-ae5e-44f2-8573-b66e14baa2f5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-v5zdr\" (UID: \"9400702e-ae5e-44f2-8573-b66e14baa2f5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:50.177971 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.177949 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-bvvqg"] Apr 21 04:02:50.180616 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:02:50.180590 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ae4af40_6f16_417f_acde_99f109430e66.slice/crio-7d106cbcc271ffad1c1010f434a9c5d7fed260a41ade90f2dea61941e338544c WatchSource:0}: Error finding container 7d106cbcc271ffad1c1010f434a9c5d7fed260a41ade90f2dea61941e338544c: Status 404 returned error can't find the container with id 7d106cbcc271ffad1c1010f434a9c5d7fed260a41ade90f2dea61941e338544c Apr 21 04:02:50.181749 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.181734 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:02:50.351999 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.351910 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:50.473076 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:50.473047 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr"] Apr 21 04:02:50.475220 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:02:50.475185 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9400702e_ae5e_44f2_8573_b66e14baa2f5.slice/crio-13b81a5611305b9328f8e87583723b37e7dda1fa3bdb9aab07435c7a036e7e7e WatchSource:0}: Error finding container 13b81a5611305b9328f8e87583723b37e7dda1fa3bdb9aab07435c7a036e7e7e: Status 404 returned error can't find the container with id 13b81a5611305b9328f8e87583723b37e7dda1fa3bdb9aab07435c7a036e7e7e Apr 21 04:02:51.126760 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:51.126719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" event={"ID":"9400702e-ae5e-44f2-8573-b66e14baa2f5","Type":"ContainerStarted","Data":"13b81a5611305b9328f8e87583723b37e7dda1fa3bdb9aab07435c7a036e7e7e"} Apr 21 04:02:51.128275 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:51.128229 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" event={"ID":"5ae4af40-6f16-417f-acde-99f109430e66","Type":"ContainerStarted","Data":"7d106cbcc271ffad1c1010f434a9c5d7fed260a41ade90f2dea61941e338544c"} Apr 21 04:02:54.139239 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:54.139207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" event={"ID":"9400702e-ae5e-44f2-8573-b66e14baa2f5","Type":"ContainerStarted","Data":"36161a41924193ba5f1453f853da4881196477038c8c838a3ec8597d72436e07"} Apr 21 04:02:54.139711 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:54.139363 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:02:54.140655 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:54.140630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" event={"ID":"5ae4af40-6f16-417f-acde-99f109430e66","Type":"ContainerStarted","Data":"be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c"} Apr 21 04:02:54.140760 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:54.140708 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:02:54.153352 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:54.153293 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" podStartSLOduration=2.330040859 podStartE2EDuration="5.153272485s" podCreationTimestamp="2026-04-21 04:02:49 +0000 UTC" firstStartedPulling="2026-04-21 04:02:50.476562271 +0000 UTC m=+351.043532740" lastFinishedPulling="2026-04-21 04:02:53.299793681 +0000 UTC m=+353.866764366" observedRunningTime="2026-04-21 04:02:54.152348476 +0000 UTC m=+354.719318969" watchObservedRunningTime="2026-04-21 04:02:54.153272485 +0000 UTC m=+354.720242978" Apr 21 04:02:54.171904 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:02:54.171849 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" podStartSLOduration=2.048744083 podStartE2EDuration="5.171832188s" podCreationTimestamp="2026-04-21 04:02:49 +0000 UTC" firstStartedPulling="2026-04-21 04:02:50.181854408 +0000 UTC m=+350.748824878" lastFinishedPulling="2026-04-21 04:02:53.304942512 +0000 UTC m=+353.871912983" observedRunningTime="2026-04-21 04:02:54.165971909 +0000 UTC m=+354.732942403" watchObservedRunningTime="2026-04-21 04:02:54.171832188 +0000 UTC m=+354.738802680" Apr 21 04:03:25.146174 ip-10-0-131-182 kubenswrapper[2578]: I0421 
04:03:25.146134 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-v5zdr" Apr 21 04:03:25.149234 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:25.149204 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:03:26.377721 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.377683 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-bvvqg"] Apr 21 04:03:26.378090 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.377921 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" podUID="5ae4af40-6f16-417f-acde-99f109430e66" containerName="manager" containerID="cri-o://be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c" gracePeriod=10 Apr 21 04:03:26.403640 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.403608 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-sqslk"] Apr 21 04:03:26.406830 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.406814 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.416200 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.416159 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-sqslk"] Apr 21 04:03:26.569815 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.569782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a943ca0-c43a-4964-9dc7-04f018e86352-cert\") pod \"kserve-controller-manager-6f655776dd-sqslk\" (UID: \"2a943ca0-c43a-4964-9dc7-04f018e86352\") " pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.569985 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.569908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjl98\" (UniqueName: \"kubernetes.io/projected/2a943ca0-c43a-4964-9dc7-04f018e86352-kube-api-access-rjl98\") pod \"kserve-controller-manager-6f655776dd-sqslk\" (UID: \"2a943ca0-c43a-4964-9dc7-04f018e86352\") " pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.607216 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.607192 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:03:26.670448 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.670412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjl98\" (UniqueName: \"kubernetes.io/projected/2a943ca0-c43a-4964-9dc7-04f018e86352-kube-api-access-rjl98\") pod \"kserve-controller-manager-6f655776dd-sqslk\" (UID: \"2a943ca0-c43a-4964-9dc7-04f018e86352\") " pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.670612 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.670480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a943ca0-c43a-4964-9dc7-04f018e86352-cert\") pod \"kserve-controller-manager-6f655776dd-sqslk\" (UID: \"2a943ca0-c43a-4964-9dc7-04f018e86352\") " pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.672940 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.672916 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a943ca0-c43a-4964-9dc7-04f018e86352-cert\") pod \"kserve-controller-manager-6f655776dd-sqslk\" (UID: \"2a943ca0-c43a-4964-9dc7-04f018e86352\") " pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.678243 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.678223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjl98\" (UniqueName: \"kubernetes.io/projected/2a943ca0-c43a-4964-9dc7-04f018e86352-kube-api-access-rjl98\") pod \"kserve-controller-manager-6f655776dd-sqslk\" (UID: \"2a943ca0-c43a-4964-9dc7-04f018e86352\") " pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.759808 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.759766 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:26.771294 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.771266 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert\") pod \"5ae4af40-6f16-417f-acde-99f109430e66\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " Apr 21 04:03:26.771435 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.771323 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjfvm\" (UniqueName: \"kubernetes.io/projected/5ae4af40-6f16-417f-acde-99f109430e66-kube-api-access-sjfvm\") pod \"5ae4af40-6f16-417f-acde-99f109430e66\" (UID: \"5ae4af40-6f16-417f-acde-99f109430e66\") " Apr 21 04:03:26.773520 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.773492 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae4af40-6f16-417f-acde-99f109430e66-kube-api-access-sjfvm" (OuterVolumeSpecName: "kube-api-access-sjfvm") pod "5ae4af40-6f16-417f-acde-99f109430e66" (UID: "5ae4af40-6f16-417f-acde-99f109430e66"). InnerVolumeSpecName "kube-api-access-sjfvm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:03:26.773625 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.773496 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert" (OuterVolumeSpecName: "cert") pod "5ae4af40-6f16-417f-acde-99f109430e66" (UID: "5ae4af40-6f16-417f-acde-99f109430e66"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:03:26.872887 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.872853 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ae4af40-6f16-417f-acde-99f109430e66-cert\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:03:26.872887 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.872883 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sjfvm\" (UniqueName: \"kubernetes.io/projected/5ae4af40-6f16-417f-acde-99f109430e66-kube-api-access-sjfvm\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:03:26.875886 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:26.875851 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-sqslk"] Apr 21 04:03:26.878195 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:03:26.878169 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a943ca0_c43a_4964_9dc7_04f018e86352.slice/crio-59095e6a2c9eebb84ce75bb8db2155c86cf974c940b625be5a210b3dc67bfe51 WatchSource:0}: Error finding container 59095e6a2c9eebb84ce75bb8db2155c86cf974c940b625be5a210b3dc67bfe51: Status 404 returned error can't find the container with id 59095e6a2c9eebb84ce75bb8db2155c86cf974c940b625be5a210b3dc67bfe51 Apr 21 04:03:27.249538 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.249512 2578 generic.go:358] "Generic (PLEG): container finished" podID="5ae4af40-6f16-417f-acde-99f109430e66" containerID="be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c" exitCode=0 Apr 21 04:03:27.249634 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.249572 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" Apr 21 04:03:27.249634 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.249598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" event={"ID":"5ae4af40-6f16-417f-acde-99f109430e66","Type":"ContainerDied","Data":"be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c"} Apr 21 04:03:27.249749 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.249642 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-bvvqg" event={"ID":"5ae4af40-6f16-417f-acde-99f109430e66","Type":"ContainerDied","Data":"7d106cbcc271ffad1c1010f434a9c5d7fed260a41ade90f2dea61941e338544c"} Apr 21 04:03:27.249749 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.249666 2578 scope.go:117] "RemoveContainer" containerID="be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c" Apr 21 04:03:27.250624 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.250582 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-sqslk" event={"ID":"2a943ca0-c43a-4964-9dc7-04f018e86352","Type":"ContainerStarted","Data":"59095e6a2c9eebb84ce75bb8db2155c86cf974c940b625be5a210b3dc67bfe51"} Apr 21 04:03:27.259880 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.259851 2578 scope.go:117] "RemoveContainer" containerID="be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c" Apr 21 04:03:27.260126 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:03:27.260108 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c\": container with ID starting with be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c not found: ID does not exist" containerID="be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c" Apr 21 
04:03:27.260209 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.260132 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c"} err="failed to get container status \"be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c\": rpc error: code = NotFound desc = could not find container \"be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c\": container with ID starting with be68417cbb721780e276aa31e8e9500bb8bdac50bc85391c99aa0c4438aa790c not found: ID does not exist" Apr 21 04:03:27.269271 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.269246 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-bvvqg"] Apr 21 04:03:27.271762 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.271743 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-bvvqg"] Apr 21 04:03:27.881720 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:27.881686 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae4af40-6f16-417f-acde-99f109430e66" path="/var/lib/kubelet/pods/5ae4af40-6f16-417f-acde-99f109430e66/volumes" Apr 21 04:03:28.255567 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:28.255531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-sqslk" event={"ID":"2a943ca0-c43a-4964-9dc7-04f018e86352","Type":"ContainerStarted","Data":"a77d20f8ebb853a34f5cc8e55f87c19c96f6bf6e1e329759f9101a5190aacab0"} Apr 21 04:03:28.255738 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:28.255639 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:03:28.270492 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:28.270441 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-6f655776dd-sqslk" podStartSLOduration=1.925812148 podStartE2EDuration="2.270428431s" podCreationTimestamp="2026-04-21 04:03:26 +0000 UTC" firstStartedPulling="2026-04-21 04:03:26.879445294 +0000 UTC m=+387.446415764" lastFinishedPulling="2026-04-21 04:03:27.224061577 +0000 UTC m=+387.791032047" observedRunningTime="2026-04-21 04:03:28.268867069 +0000 UTC m=+388.835837562" watchObservedRunningTime="2026-04-21 04:03:28.270428431 +0000 UTC m=+388.837398923" Apr 21 04:03:53.895177 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:53.895143 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-694658b7c-bdbj9"] Apr 21 04:03:53.895603 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:53.895583 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ae4af40-6f16-417f-acde-99f109430e66" containerName="manager" Apr 21 04:03:53.895650 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:53.895607 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae4af40-6f16-417f-acde-99f109430e66" containerName="manager" Apr 21 04:03:53.895693 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:53.895685 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ae4af40-6f16-417f-acde-99f109430e66" containerName="manager" Apr 21 04:03:53.898885 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:53.898864 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:53.913266 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:53.913238 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-694658b7c-bdbj9"] Apr 21 04:03:54.006122 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.006088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-trusted-ca-bundle\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.006122 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.006121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-console-config\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.006364 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.006138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zm8\" (UniqueName: \"kubernetes.io/projected/389ead3c-0505-4cf2-8434-6460bf647484-kube-api-access-k7zm8\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.006364 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.006189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/389ead3c-0505-4cf2-8434-6460bf647484-console-serving-cert\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 
04:03:54.006364 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.006245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/389ead3c-0505-4cf2-8434-6460bf647484-console-oauth-config\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.006364 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.006344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-oauth-serving-cert\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.006587 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.006367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-service-ca\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107041 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-oauth-serving-cert\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107041 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-service-ca\") pod \"console-694658b7c-bdbj9\" (UID: 
\"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107295 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-trusted-ca-bundle\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107295 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-console-config\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107295 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7zm8\" (UniqueName: \"kubernetes.io/projected/389ead3c-0505-4cf2-8434-6460bf647484-kube-api-access-k7zm8\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107485 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/389ead3c-0505-4cf2-8434-6460bf647484-console-serving-cert\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107485 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/389ead3c-0505-4cf2-8434-6460bf647484-console-oauth-config\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107855 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-oauth-serving-cert\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.107855 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-service-ca\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.108022 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107917 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-console-config\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.108022 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.107939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/389ead3c-0505-4cf2-8434-6460bf647484-trusted-ca-bundle\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.109833 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.109815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/389ead3c-0505-4cf2-8434-6460bf647484-console-oauth-config\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.109945 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.109895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/389ead3c-0505-4cf2-8434-6460bf647484-console-serving-cert\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.114438 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.114418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zm8\" (UniqueName: \"kubernetes.io/projected/389ead3c-0505-4cf2-8434-6460bf647484-kube-api-access-k7zm8\") pod \"console-694658b7c-bdbj9\" (UID: \"389ead3c-0505-4cf2-8434-6460bf647484\") " pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.208929 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.208884 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:03:54.335635 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.335569 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-694658b7c-bdbj9"] Apr 21 04:03:54.337834 ip-10-0-131-182 kubenswrapper[2578]: W0421 04:03:54.337810 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389ead3c_0505_4cf2_8434_6460bf647484.slice/crio-7112268d07c95f1cf996e3a11e77702fd72ed4bae9b367f0f38355506fa6abe9 WatchSource:0}: Error finding container 7112268d07c95f1cf996e3a11e77702fd72ed4bae9b367f0f38355506fa6abe9: Status 404 returned error can't find the container with id 7112268d07c95f1cf996e3a11e77702fd72ed4bae9b367f0f38355506fa6abe9 Apr 21 04:03:54.343984 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:54.343956 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-694658b7c-bdbj9" event={"ID":"389ead3c-0505-4cf2-8434-6460bf647484","Type":"ContainerStarted","Data":"7112268d07c95f1cf996e3a11e77702fd72ed4bae9b367f0f38355506fa6abe9"} Apr 21 04:03:55.349563 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:55.349531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-694658b7c-bdbj9" event={"ID":"389ead3c-0505-4cf2-8434-6460bf647484","Type":"ContainerStarted","Data":"7ea23c2d9ed96827491b3fb244c5e758b0ed6b5c7b02b8ad9c9702fd91485828"} Apr 21 04:03:55.367675 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:55.367624 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-694658b7c-bdbj9" podStartSLOduration=2.367610826 podStartE2EDuration="2.367610826s" podCreationTimestamp="2026-04-21 04:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:03:55.36507022 +0000 UTC m=+415.932040722" 
watchObservedRunningTime="2026-04-21 04:03:55.367610826 +0000 UTC m=+415.934581318" Apr 21 04:03:59.265525 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:03:59.265491 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-sqslk" Apr 21 04:04:04.209820 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:04.209762 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:04:04.209820 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:04.209828 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:04:04.214692 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:04.214668 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:04:04.385962 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:04.385933 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-694658b7c-bdbj9" Apr 21 04:04:04.431416 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:04.431377 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-686976b7d5-7c8v6"] Apr 21 04:04:29.451145 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.451038 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-686976b7d5-7c8v6" podUID="7741d138-b730-4f49-987e-75d6648f19f6" containerName="console" containerID="cri-o://1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f" gracePeriod=15 Apr 21 04:04:29.691729 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.691705 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-686976b7d5-7c8v6_7741d138-b730-4f49-987e-75d6648f19f6/console/0.log" Apr 21 04:04:29.691878 ip-10-0-131-182 kubenswrapper[2578]: I0421 
04:04:29.691765 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:04:29.709182 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.708720 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-trusted-ca-bundle\") pod \"7741d138-b730-4f49-987e-75d6648f19f6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " Apr 21 04:04:29.709182 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.708776 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-oauth-serving-cert\") pod \"7741d138-b730-4f49-987e-75d6648f19f6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " Apr 21 04:04:29.709182 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.708882 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dhff\" (UniqueName: \"kubernetes.io/projected/7741d138-b730-4f49-987e-75d6648f19f6-kube-api-access-4dhff\") pod \"7741d138-b730-4f49-987e-75d6648f19f6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " Apr 21 04:04:29.709182 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.708910 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-oauth-config\") pod \"7741d138-b730-4f49-987e-75d6648f19f6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " Apr 21 04:04:29.709182 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.708958 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-serving-cert\") pod 
\"7741d138-b730-4f49-987e-75d6648f19f6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " Apr 21 04:04:29.709182 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.709032 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-console-config\") pod \"7741d138-b730-4f49-987e-75d6648f19f6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " Apr 21 04:04:29.709182 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.709061 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-service-ca\") pod \"7741d138-b730-4f49-987e-75d6648f19f6\" (UID: \"7741d138-b730-4f49-987e-75d6648f19f6\") " Apr 21 04:04:29.709988 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.709934 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-service-ca" (OuterVolumeSpecName: "service-ca") pod "7741d138-b730-4f49-987e-75d6648f19f6" (UID: "7741d138-b730-4f49-987e-75d6648f19f6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:04:29.710215 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.710187 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7741d138-b730-4f49-987e-75d6648f19f6" (UID: "7741d138-b730-4f49-987e-75d6648f19f6"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:04:29.710346 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.710194 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7741d138-b730-4f49-987e-75d6648f19f6" (UID: "7741d138-b730-4f49-987e-75d6648f19f6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:04:29.710346 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.710233 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-console-config" (OuterVolumeSpecName: "console-config") pod "7741d138-b730-4f49-987e-75d6648f19f6" (UID: "7741d138-b730-4f49-987e-75d6648f19f6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:04:29.712284 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.712250 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7741d138-b730-4f49-987e-75d6648f19f6-kube-api-access-4dhff" (OuterVolumeSpecName: "kube-api-access-4dhff") pod "7741d138-b730-4f49-987e-75d6648f19f6" (UID: "7741d138-b730-4f49-987e-75d6648f19f6"). InnerVolumeSpecName "kube-api-access-4dhff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:04:29.712416 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.712370 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7741d138-b730-4f49-987e-75d6648f19f6" (UID: "7741d138-b730-4f49-987e-75d6648f19f6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:04:29.712487 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.712469 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7741d138-b730-4f49-987e-75d6648f19f6" (UID: "7741d138-b730-4f49-987e-75d6648f19f6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:04:29.810831 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.810789 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dhff\" (UniqueName: \"kubernetes.io/projected/7741d138-b730-4f49-987e-75d6648f19f6-kube-api-access-4dhff\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:04:29.810831 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.810825 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-oauth-config\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:04:29.810831 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.810835 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7741d138-b730-4f49-987e-75d6648f19f6-console-serving-cert\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:04:29.811057 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.810844 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-console-config\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:04:29.811057 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.810854 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-service-ca\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:04:29.811057 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.810862 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-trusted-ca-bundle\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:04:29.811057 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:29.810871 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7741d138-b730-4f49-987e-75d6648f19f6-oauth-serving-cert\") on node \"ip-10-0-131-182.ec2.internal\" DevicePath \"\"" Apr 21 04:04:30.469250 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.469222 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-686976b7d5-7c8v6_7741d138-b730-4f49-987e-75d6648f19f6/console/0.log" Apr 21 04:04:30.469845 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.469262 2578 generic.go:358] "Generic (PLEG): container finished" podID="7741d138-b730-4f49-987e-75d6648f19f6" containerID="1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f" exitCode=2 Apr 21 04:04:30.469845 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.469290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-686976b7d5-7c8v6" event={"ID":"7741d138-b730-4f49-987e-75d6648f19f6","Type":"ContainerDied","Data":"1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f"} Apr 21 04:04:30.469845 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.469347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-686976b7d5-7c8v6" event={"ID":"7741d138-b730-4f49-987e-75d6648f19f6","Type":"ContainerDied","Data":"3db1a3c2cef59f403c6adaededc7f8d8a7d4f8c0cffc7066ac56e1f1fad5eae3"} Apr 21 04:04:30.469845 ip-10-0-131-182 
kubenswrapper[2578]: I0421 04:04:30.469363 2578 scope.go:117] "RemoveContainer" containerID="1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f" Apr 21 04:04:30.469845 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.469383 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-686976b7d5-7c8v6" Apr 21 04:04:30.477735 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.477717 2578 scope.go:117] "RemoveContainer" containerID="1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f" Apr 21 04:04:30.477974 ip-10-0-131-182 kubenswrapper[2578]: E0421 04:04:30.477955 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f\": container with ID starting with 1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f not found: ID does not exist" containerID="1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f" Apr 21 04:04:30.478026 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.477982 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f"} err="failed to get container status \"1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f\": rpc error: code = NotFound desc = could not find container \"1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f\": container with ID starting with 1001708593afaa0f4bb01cad3f36004fddbc659f74727b99e23164ecb4bbc01f not found: ID does not exist" Apr 21 04:04:30.489108 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.489081 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-686976b7d5-7c8v6"] Apr 21 04:04:30.496827 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:30.496802 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-686976b7d5-7c8v6"] Apr 21 04:04:31.877235 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:04:31.877199 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7741d138-b730-4f49-987e-75d6648f19f6" path="/var/lib/kubelet/pods/7741d138-b730-4f49-987e-75d6648f19f6/volumes" Apr 21 04:06:59.830403 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:06:59.830372 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:06:59.836121 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:06:59.836096 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:06:59.836454 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:06:59.836437 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:06:59.839651 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:06:59.839632 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:11:59.855895 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:11:59.855815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:11:59.859489 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:11:59.859461 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:11:59.861684 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:11:59.861662 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:11:59.864858 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:11:59.864835 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:16:59.881459 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:16:59.881433 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:16:59.884890 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:16:59.884868 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:16:59.888457 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:16:59.888436 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:16:59.891625 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:16:59.891608 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:21:59.905431 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:21:59.905400 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:21:59.908702 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:21:59.908681 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:21:59.914344 ip-10-0-131-182 
kubenswrapper[2578]: I0421 04:21:59.914326 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:21:59.917717 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:21:59.917697 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:26:59.929277 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:26:59.929244 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:26:59.932819 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:26:59.932788 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:26:59.940483 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:26:59.940458 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:26:59.943954 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:26:59.943934 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:31:59.954141 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:31:59.954114 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:31:59.957407 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:31:59.957380 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:31:59.972235 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:31:59.972206 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:31:59.975654 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:31:59.975632 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:36:59.977538 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:36:59.977506 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:36:59.980416 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:36:59.980394 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:36:59.996722 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:36:59.996695 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:36:59.999979 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:36:59.999958 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:42:00.000467 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:42:00.000438 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:42:00.003515 ip-10-0-131-182 
kubenswrapper[2578]: I0421 04:42:00.003494 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:42:00.024893 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:42:00.024862 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:42:00.028103 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:42:00.028082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:47:00.023928 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:47:00.023902 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:47:00.027208 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:47:00.027182 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:47:00.048405 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:47:00.048384 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:47:00.051571 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:47:00.051552 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:52:00.047414 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:52:00.047291 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:52:00.051521 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:52:00.050458 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:52:00.074203 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:52:00.074171 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:52:00.077666 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:52:00.077644 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:57:00.071017 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:57:00.070910 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:57:00.076027 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:57:00.074182 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 04:57:00.107990 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:57:00.107963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 04:57:00.111456 ip-10-0-131-182 kubenswrapper[2578]: I0421 04:57:00.111435 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log" Apr 21 05:00:37.737722 ip-10-0-131-182 
kubenswrapper[2578]: I0421 05:00:37.737678 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lqtd7_8ba63635-96d8-482b-a6ee-d309369ecee1/global-pull-secret-syncer/0.log" Apr 21 05:00:37.906134 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:37.906098 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kqnd2_a68bd18b-f831-4da9-b0f4-303529c0c0ca/konnectivity-agent/0.log" Apr 21 05:00:37.950952 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:37.950925 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-182.ec2.internal_153f8e46061c7f7dc78983197c922203/haproxy/0.log" Apr 21 05:00:41.277568 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.277541 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd041aa4-3684-4ba0-ac16-42f892202ca4/alertmanager/0.log" Apr 21 05:00:41.301375 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.301338 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd041aa4-3684-4ba0-ac16-42f892202ca4/config-reloader/0.log" Apr 21 05:00:41.323329 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.323279 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd041aa4-3684-4ba0-ac16-42f892202ca4/kube-rbac-proxy-web/0.log" Apr 21 05:00:41.345304 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.345242 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd041aa4-3684-4ba0-ac16-42f892202ca4/kube-rbac-proxy/0.log" Apr 21 05:00:41.366642 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.366619 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd041aa4-3684-4ba0-ac16-42f892202ca4/kube-rbac-proxy-metric/0.log" Apr 21 05:00:41.388848 
ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.388818 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd041aa4-3684-4ba0-ac16-42f892202ca4/prom-label-proxy/0.log" Apr 21 05:00:41.409224 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.409199 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fd041aa4-3684-4ba0-ac16-42f892202ca4/init-config-reloader/0.log" Apr 21 05:00:41.451083 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.451049 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-rf7f7_53593715-fad9-4f5d-8bfa-5579ca4bfd14/cluster-monitoring-operator/0.log" Apr 21 05:00:41.685677 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.685647 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vbqg8_63d4e126-93be-4cd9-9f82-de3809f011a9/node-exporter/0.log" Apr 21 05:00:41.707861 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.707833 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vbqg8_63d4e126-93be-4cd9-9f82-de3809f011a9/kube-rbac-proxy/0.log" Apr 21 05:00:41.728384 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:41.728358 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vbqg8_63d4e126-93be-4cd9-9f82-de3809f011a9/init-textfile/0.log" Apr 21 05:00:42.074068 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.073990 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-mh68z_74058be4-ad34-4bb3-a9f8-1a70c3b056f7/prometheus-operator/0.log" Apr 21 05:00:42.096352 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.096324 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-mh68z_74058be4-ad34-4bb3-a9f8-1a70c3b056f7/kube-rbac-proxy/0.log" Apr 21 05:00:42.119515 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.119485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-zfnns_64ed9a8a-2860-46ac-998c-2512e17b4ce8/prometheus-operator-admission-webhook/0.log" Apr 21 05:00:42.146663 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.146629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b5ff8988c-fdvmf_7d9538ae-470d-46c0-b07e-62b8c7686666/telemeter-client/0.log" Apr 21 05:00:42.170094 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.170069 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b5ff8988c-fdvmf_7d9538ae-470d-46c0-b07e-62b8c7686666/reload/0.log" Apr 21 05:00:42.193274 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.193250 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b5ff8988c-fdvmf_7d9538ae-470d-46c0-b07e-62b8c7686666/kube-rbac-proxy/0.log" Apr 21 05:00:42.221731 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.221699 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74cb4966bb-7r772_43d0e693-7ea1-400b-a779-cb496fc9bf3f/thanos-query/0.log" Apr 21 05:00:42.245355 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.245305 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74cb4966bb-7r772_43d0e693-7ea1-400b-a779-cb496fc9bf3f/kube-rbac-proxy-web/0.log" Apr 21 05:00:42.267624 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.267593 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74cb4966bb-7r772_43d0e693-7ea1-400b-a779-cb496fc9bf3f/kube-rbac-proxy/0.log" Apr 
21 05:00:42.289009 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.288979 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74cb4966bb-7r772_43d0e693-7ea1-400b-a779-cb496fc9bf3f/prom-label-proxy/0.log" Apr 21 05:00:42.312425 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.312396 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74cb4966bb-7r772_43d0e693-7ea1-400b-a779-cb496fc9bf3f/kube-rbac-proxy-rules/0.log" Apr 21 05:00:42.334580 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:42.334498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74cb4966bb-7r772_43d0e693-7ea1-400b-a779-cb496fc9bf3f/kube-rbac-proxy-metrics/0.log" Apr 21 05:00:43.442103 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:43.442070 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-dg8bb_2846597e-4516-4d1d-9e48-8d7c984b548c/networking-console-plugin/0.log" Apr 21 05:00:43.865300 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:43.865225 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/2.log" Apr 21 05:00:43.869179 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:43.869159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5vjmh_d67a7c8e-4570-49e2-a6b2-fb6ceeba5a69/console-operator/3.log" Apr 21 05:00:44.252702 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.252672 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-694658b7c-bdbj9_389ead3c-0505-4cf2-8434-6460bf647484/console/0.log" Apr 21 05:00:44.728136 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.728104 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zl47f_f065d82f-468a-4343-b62c-5c000b2c9ad2/volume-data-source-validator/0.log" Apr 21 05:00:44.893469 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.893437 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt"] Apr 21 05:00:44.893890 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.893874 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7741d138-b730-4f49-987e-75d6648f19f6" containerName="console" Apr 21 05:00:44.893890 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.893890 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7741d138-b730-4f49-987e-75d6648f19f6" containerName="console" Apr 21 05:00:44.894031 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.893987 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7741d138-b730-4f49-987e-75d6648f19f6" containerName="console" Apr 21 05:00:44.896944 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.896923 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:44.898764 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.898745 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqvxw\"/\"kube-root-ca.crt\"" Apr 21 05:00:44.899361 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.899335 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqvxw\"/\"openshift-service-ca.crt\"" Apr 21 05:00:44.899361 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.899336 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vqvxw\"/\"default-dockercfg-vstpr\"" Apr 21 05:00:44.906165 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.906138 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt"] Apr 21 05:00:44.992957 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.992869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-proc\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:44.992957 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.992917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-podres\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:44.993154 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.992992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-sys\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:44.993154 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.993048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-lib-modules\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:44.993154 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:44.993066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8s95\" (UniqueName: \"kubernetes.io/projected/dda505ea-6791-48b0-ab7b-688978e1d149-kube-api-access-x8s95\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093755 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-podres\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093937 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-sys\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " 
pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093937 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-lib-modules\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093937 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8s95\" (UniqueName: \"kubernetes.io/projected/dda505ea-6791-48b0-ab7b-688978e1d149-kube-api-access-x8s95\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093937 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-sys\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093937 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-proc\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093937 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093874 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-podres\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.093937 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-proc\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.094193 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.093962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dda505ea-6791-48b0-ab7b-688978e1d149-lib-modules\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.100999 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.100975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8s95\" (UniqueName: \"kubernetes.io/projected/dda505ea-6791-48b0-ab7b-688978e1d149-kube-api-access-x8s95\") pod \"perf-node-gather-daemonset-hltkt\" (UID: \"dda505ea-6791-48b0-ab7b-688978e1d149\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.208726 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.208695 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.332267 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.332234 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt"] Apr 21 05:00:45.335578 ip-10-0-131-182 kubenswrapper[2578]: W0421 05:00:45.335547 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddda505ea_6791_48b0_ab7b_688978e1d149.slice/crio-ddc160ecb30225763ad4500bc6b4f8ec9c698fcb63966b29e158b603d018e67c WatchSource:0}: Error finding container ddc160ecb30225763ad4500bc6b4f8ec9c698fcb63966b29e158b603d018e67c: Status 404 returned error can't find the container with id ddc160ecb30225763ad4500bc6b4f8ec9c698fcb63966b29e158b603d018e67c Apr 21 05:00:45.337154 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.337138 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 05:00:45.442462 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.442426 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7mb9p_86dacc78-49b6-4d77-b33d-e5f6f827d63e/dns/0.log" Apr 21 05:00:45.464204 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.464174 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7mb9p_86dacc78-49b6-4d77-b33d-e5f6f827d63e/kube-rbac-proxy/0.log" Apr 21 05:00:45.551393 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.551288 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m7pww_48171444-16ad-44d3-adcd-dbc651bb6b7e/dns-node-resolver/0.log" Apr 21 05:00:45.864964 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.864876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" 
event={"ID":"dda505ea-6791-48b0-ab7b-688978e1d149","Type":"ContainerStarted","Data":"204b6eb8f0fbac69d9579b7643fbee39792314677a3c5bb058acabe5d7143c1b"} Apr 21 05:00:45.864964 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.864920 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" event={"ID":"dda505ea-6791-48b0-ab7b-688978e1d149","Type":"ContainerStarted","Data":"ddc160ecb30225763ad4500bc6b4f8ec9c698fcb63966b29e158b603d018e67c"} Apr 21 05:00:45.865389 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.864964 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" Apr 21 05:00:45.880003 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.879955 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt" podStartSLOduration=1.879938028 podStartE2EDuration="1.879938028s" podCreationTimestamp="2026-04-21 05:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 05:00:45.878079919 +0000 UTC m=+3826.445050411" watchObservedRunningTime="2026-04-21 05:00:45.879938028 +0000 UTC m=+3826.446908519" Apr 21 05:00:45.982171 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:45.982134 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-67d8c4d679-5fkvs_ac968620-3704-45c1-9775-d96d60659cc1/registry/0.log" Apr 21 05:00:46.003154 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:46.003120 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9j76c_4b9475f5-b0ae-474d-b51c-d5a1efe76899/node-ca/0.log" Apr 21 05:00:47.131697 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:47.131667 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-zbrz9_c3286e93-02a3-4094-a61d-5b8ba11a35d6/serve-healthcheck-canary/0.log"
Apr 21 05:00:47.487904 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:47.487859 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-s5w6b_b31507b9-91ed-4a27-ad54-be88b2865602/insights-operator/0.log"
Apr 21 05:00:47.489390 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:47.489368 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-s5w6b_b31507b9-91ed-4a27-ad54-be88b2865602/insights-operator/1.log"
Apr 21 05:00:47.508397 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:47.508370 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4sndf_357eea12-bfee-4163-a496-c41dc3e15906/kube-rbac-proxy/0.log"
Apr 21 05:00:47.529376 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:47.529350 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4sndf_357eea12-bfee-4163-a496-c41dc3e15906/exporter/0.log"
Apr 21 05:00:47.550896 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:47.550861 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4sndf_357eea12-bfee-4163-a496-c41dc3e15906/extractor/0.log"
Apr 21 05:00:49.602777 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:49.602744 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-sqslk_2a943ca0-c43a-4964-9dc7-04f018e86352/manager/0.log"
Apr 21 05:00:49.622652 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:49.622629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-v5zdr_9400702e-ae5e-44f2-8573-b66e14baa2f5/manager/0.log"
Apr 21 05:00:51.878823 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:51.878796 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-hltkt"
Apr 21 05:00:54.188689 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:54.188655 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2nv4r_bdaaadc4-2dd1-4be8-955c-755deb5df200/kube-storage-version-migrator-operator/1.log"
Apr 21 05:00:54.189463 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:54.189440 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-2nv4r_bdaaadc4-2dd1-4be8-955c-755deb5df200/kube-storage-version-migrator-operator/0.log"
Apr 21 05:00:55.132368 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.132339 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8v8nb_db466d44-e778-45ee-935c-04ea6c071763/kube-multus-additional-cni-plugins/0.log"
Apr 21 05:00:55.154864 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.154833 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8v8nb_db466d44-e778-45ee-935c-04ea6c071763/egress-router-binary-copy/0.log"
Apr 21 05:00:55.178486 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.178453 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8v8nb_db466d44-e778-45ee-935c-04ea6c071763/cni-plugins/0.log"
Apr 21 05:00:55.199710 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.199685 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8v8nb_db466d44-e778-45ee-935c-04ea6c071763/bond-cni-plugin/0.log"
Apr 21 05:00:55.220353 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.220302 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8v8nb_db466d44-e778-45ee-935c-04ea6c071763/routeoverride-cni/0.log"
Apr 21 05:00:55.240621 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.240595 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8v8nb_db466d44-e778-45ee-935c-04ea6c071763/whereabouts-cni-bincopy/0.log"
Apr 21 05:00:55.260906 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.260881 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8v8nb_db466d44-e778-45ee-935c-04ea6c071763/whereabouts-cni/0.log"
Apr 21 05:00:55.595832 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.595801 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdjrx_7ebaea07-99ce-462a-8ed5-d99c06d6417e/kube-multus/0.log"
Apr 21 05:00:55.702587 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.702513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wcnkn_88564462-797f-416f-b56b-0e31e0156815/network-metrics-daemon/0.log"
Apr 21 05:00:55.726347 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:55.726303 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wcnkn_88564462-797f-416f-b56b-0e31e0156815/kube-rbac-proxy/0.log"
Apr 21 05:00:56.505019 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.504992 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-controller/0.log"
Apr 21 05:00:56.525997 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.525975 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/0.log"
Apr 21 05:00:56.542582 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.542562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovn-acl-logging/1.log"
Apr 21 05:00:56.562571 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.562550 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/kube-rbac-proxy-node/0.log"
Apr 21 05:00:56.583282 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.583246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 05:00:56.603517 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.603496 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/northd/0.log"
Apr 21 05:00:56.625559 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.625537 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/nbdb/0.log"
Apr 21 05:00:56.647455 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.647431 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/sbdb/0.log"
Apr 21 05:00:56.747016 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:56.746985 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ssvt_2d01920e-91d4-4f4c-b55b-21442ffc88c5/ovnkube-controller/0.log"
Apr 21 05:00:58.369696 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:58.369660 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sql8b_e498c7e8-3ee2-49ce-8ccf-9a86e869f003/network-check-target-container/0.log"
Apr 21 05:00:59.231832 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:59.231803 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-64s86_29783e2f-7e33-4e0b-a972-c774556775ce/iptables-alerter/0.log"
Apr 21 05:00:59.902497 ip-10-0-131-182 kubenswrapper[2578]: I0421 05:00:59.902452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ng54w_9a8b2ae1-8e68-4c2d-a316-6fc7547a3812/tuned/0.log"