Apr 21 06:23:58.233351 ip-10-0-129-55 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 06:23:58.233363 ip-10-0-129-55 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 06:23:58.233370 ip-10-0-129-55 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 06:23:58.233617 ip-10-0-129-55 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 06:24:08.384067 ip-10-0-129-55 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 06:24:08.384090 ip-10-0-129-55 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8350b6fa0e1643698441a6451dc75081 --
Apr 21 06:26:35.229061 ip-10-0-129-55 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 06:26:35.676944 ip-10-0-129-55 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:35.676944 ip-10-0-129-55 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 06:26:35.676944 ip-10-0-129-55 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:35.676944 ip-10-0-129-55 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 06:26:35.676944 ip-10-0-129-55 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:35.677666 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.677575 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 06:26:35.680712 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680695 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:35.680712 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680711 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680715 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680718 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680721 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680737 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680741 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680744 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680747 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680750 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680752 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680755 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680758 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680760 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680763 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680765 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680768 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680770 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680773 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680776 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680778 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:35.680879 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680781 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680783 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680792 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680796 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680798 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680801 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680804 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680806 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680808 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680811 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680814 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680817 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680819 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680822 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680826 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680829 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680832 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680835 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680837 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680840 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:35.681354 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680842 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680845 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680847 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680850 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680853 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680855 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680860 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680864 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680867 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680869 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680872 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680874 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680877 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680880 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680882 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680885 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680887 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680890 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680893 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680895 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:35.681908 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680904 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680907 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680909 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680912 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680914 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680919 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680924 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680928 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680931 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680935 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680938 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680941 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680943 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680946 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680949 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680952 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680955 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680957 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680960 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:35.682394 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680962 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680965 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680968 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680971 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680974 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.680976 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681364 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681369 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681372 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681374 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681377 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681386 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681389 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681394 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681397 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681400 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681403 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681405 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681410 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:35.682876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681413 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681416 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681419 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681421 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681424 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681426 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681429 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681432 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681434 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681437 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681439 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681442 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681444 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681447 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681449 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681452 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681455 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681458 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681460 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681463 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:35.683337 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681466 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681468 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681471 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681473 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681476 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681478 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681480 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681483 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681486 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681488 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681491 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681494 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681497 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681500 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681503 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681505 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681507 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681510 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681512 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681515 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:35.683841 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681517 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681519 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681522 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681524 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681527 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681529 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681531 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681534 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681537 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681540 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681542 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681545 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681547 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681549 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681552 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681555 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681558 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681560 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681563 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681565 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:35.684373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681567 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681571 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681573 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681576 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681579 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681582 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681584 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681586 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681589 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681591 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681594 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681596 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.681598 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682360 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682369 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682376 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682380 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682384 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682388 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682393 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682398 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 06:26:35.684885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682401 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682405 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682408 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682412 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682415 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682418 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682420 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682423 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682426 2577 flags.go:64] FLAG: --cloud-config=""
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682429 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682431 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682436 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682439 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682442 2577 flags.go:64] FLAG: --config-dir=""
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682445 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682448 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682452 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682456 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682459 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682462 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682465 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682467 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682470 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682473 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682476 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 06:26:35.685397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682480 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682483 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682486 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682488 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682491 2577 flags.go:64] FLAG: --enable-server="true"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682494 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682501 2577 flags.go:64] FLAG: --event-burst="100"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682504 2577 flags.go:64] FLAG: --event-qps="50"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682507 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682510 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682512 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682516 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682519 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682522 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682525 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682528 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682531 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682533 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682536 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682539 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21
06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682542 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682545 2577 flags.go:64] FLAG: --feature-gates="" Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682548 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682551 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682554 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 06:26:35.686021 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682558 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682561 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682564 2577 flags.go:64] FLAG: --help="false" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682566 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-129-55.ec2.internal" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682570 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682573 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682575 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682578 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682582 2577 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682584 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682587 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682590 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682592 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682596 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682599 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682603 2577 flags.go:64] FLAG: --kube-reserved="" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682606 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682610 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682613 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682616 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682618 2577 flags.go:64] FLAG: --lock-file="" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682621 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682624 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:26:35.682627 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 06:26:35.686616 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682632 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682635 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682637 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682640 2577 flags.go:64] FLAG: --logging-format="text" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682643 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682646 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682649 2577 flags.go:64] FLAG: --manifest-url="" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682652 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682656 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682659 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682664 2577 flags.go:64] FLAG: --max-pods="110" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682667 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682669 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682672 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 
06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682675 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682678 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682681 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682684 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682692 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682695 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682698 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682702 2577 flags.go:64] FLAG: --pod-cidr="" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682705 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 06:26:35.687215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682711 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682714 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682717 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682722 2577 flags.go:64] FLAG: --port="10250" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682737 2577 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682740 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a70f49f920c1cad3" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682743 2577 flags.go:64] FLAG: --qos-reserved="" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682746 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682749 2577 flags.go:64] FLAG: --register-node="true" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682752 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682755 2577 flags.go:64] FLAG: --register-with-taints="" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682763 2577 flags.go:64] FLAG: --registry-burst="10" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682766 2577 flags.go:64] FLAG: --registry-qps="5" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682769 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682771 2577 flags.go:64] FLAG: --reserved-memory="" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682775 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682778 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682781 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682784 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682787 2577 flags.go:64] FLAG: 
--runonce="false" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682790 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682794 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682796 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682799 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682802 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682805 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 06:26:35.687791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682808 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682811 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682814 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682816 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682821 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682824 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682827 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682830 2577 flags.go:64] FLAG: --system-cgroups="" Apr 21 06:26:35.688414 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:26:35.682832 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682840 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682843 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682846 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682850 2577 flags.go:64] FLAG: --tls-min-version="" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682852 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682855 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682858 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682861 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682864 2577 flags.go:64] FLAG: --v="2" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682868 2577 flags.go:64] FLAG: --version="false" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682872 2577 flags.go:64] FLAG: --vmodule="" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682876 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.682880 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682973 2577 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682977 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 06:26:35.688414 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682981 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682983 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682987 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682991 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682994 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.682997 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683000 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683003 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683006 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683009 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683011 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 
06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683014 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683018 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683021 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683023 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683026 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683028 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683032 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683035 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 06:26:35.689011 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683037 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683040 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683043 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683045 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683048 2577 feature_gate.go:328] unrecognized feature 
gate: Example2 Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683050 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683053 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683055 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683058 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683060 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683063 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683065 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683068 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683070 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683072 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683075 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683078 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683080 2577 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683083 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683085 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 06:26:35.689518 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683088 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683090 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683093 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683095 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683098 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683101 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683104 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683106 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683110 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683113 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683117 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683119 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683122 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683124 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683127 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683129 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683131 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683134 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683136 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683139 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 06:26:35.690020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683141 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683143 2577 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683146 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683148 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683151 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683153 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683156 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683158 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683161 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683164 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683166 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683169 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683171 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683174 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 
06:26:35.683176 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683179 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683181 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683185 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683187 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:35.690530 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683190 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683192 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683194 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683198 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683201 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.683203 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.683208 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.690765 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.690784 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690831 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690836 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690840 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690843 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690846 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690850 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690852 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:35.691020 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690855 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690857 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690860 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690862 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690865 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690867 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690870 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690873 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690876 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690878 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690881 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690883 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690886 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690889 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690891 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690894 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690897 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690899 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690902 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690904 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:35.691422 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690907 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690911 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690915 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690917 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690921 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690924 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690927 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690929 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690932 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690934 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690937 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690939 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690942 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690944 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690946 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690949 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690951 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690954 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690956 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690959 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:35.691930 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690961 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690963 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690966 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690970 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690974 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690977 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690980 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690983 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690986 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690990 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690992 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690995 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.690997 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691000 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691002 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691005 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691007 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691010 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:35.692427 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691013 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691016 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691018 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691021 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691024 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691026 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691029 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691031 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691034 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691037 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691039 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691042 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691044 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691046 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691049 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691051 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691054 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691057 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691059 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691061 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:35.692898 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691064 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.691069 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691187 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691193 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691197 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691200 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691202 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691205 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691208 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691210 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691212 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691215 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691218 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691221 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691225 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:35.693384 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691228 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691231 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691234 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691237 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691239 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691242 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691245 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691248 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691250 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691253 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691256 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691258 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691261 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691263 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691265 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691268 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691270 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691273 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691275 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:35.693767 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691278 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691280 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691283 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691286 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691288 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691291 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691293 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691296 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691298 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691300 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691303 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691312 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691314 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691317 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691319 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691322 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691324 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691326 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691329 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691331 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:35.694232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691334 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691336 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691338 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691341 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691343 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691345 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691348 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691350 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691353 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691355 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691358 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691361 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691365 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691368 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691370 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691373 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691375 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691377 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691380 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691382 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:35.694748 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691385 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691387 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691390 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691392 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691400 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691403 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691405 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691407 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691410 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691412 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691415 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691417 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691420 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:35.691422 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.691427 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.692060 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 06:26:35.695232 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.694250 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 06:26:35.695635 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.695281 2577 server.go:1019] "Starting client certificate rotation"
Apr 21 06:26:35.695635 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.695391 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 06:26:35.695635 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.695425 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 06:26:35.725135 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.725107 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 06:26:35.726952 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.726932 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 06:26:35.739717 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.739699 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 21 06:26:35.745292 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.745266 2577 log.go:25] "Validated CRI v1 image API"
Apr 21 06:26:35.746454 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.746440 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 06:26:35.749376 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.749358 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 06:26:35.751800 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.751780 2577 fs.go:135] Filesystem UUIDs: map[719dd773-ad35-4da9-be70-e12fb58d62bb:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e324bb09-6418-4b1f-b7a5-b3522579a1d5:/dev/nvme0n1p3]
Apr 21 06:26:35.751871 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.751800 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 06:26:35.758826 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.758687 2577 manager.go:217] Machine: {Timestamp:2026-04-21 06:26:35.756814784 +0000 UTC m=+0.409260593 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099791 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec232c54fa9feb776db6c895ac7d4775 SystemUUID:ec232c54-fa9f-eb77-6db6-c895ac7d4775 BootID:8350b6fa-0e16-4369-8441-a6451dc75081 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ac:16:e4:47:c9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ac:16:e4:47:c9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7a:93:ff:ad:f2:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 06:26:35.758826 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.758822 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 06:26:35.758952 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.758940 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 06:26:35.760200 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.760172 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 06:26:35.760356 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.760210 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-55.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOp
tions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 06:26:35.760401 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.760368 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 06:26:35.760401 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.760378 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 06:26:35.760401 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.760390 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 06:26:35.761173 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.761163 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 06:26:35.762047 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.762038 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 21 06:26:35.762174 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.762165 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 06:26:35.764543 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.764533 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 21 06:26:35.764590 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.764547 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 06:26:35.764590 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.764566 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 06:26:35.764590 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.764577 2577 kubelet.go:397] "Adding apiserver pod source" Apr 21 06:26:35.764590 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.764586 2577 apiserver.go:42] 
"Waiting for node sync before watching apiserver pods" Apr 21 06:26:35.765767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.765747 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 06:26:35.766565 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.766549 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 06:26:35.770066 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.770049 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 06:26:35.771752 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.771705 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ld948" Apr 21 06:26:35.771888 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.771869 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 06:26:35.773197 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773185 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773208 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773218 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773228 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773234 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:26:35.773239 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773245 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773250 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773256 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773262 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773271 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 06:26:35.773291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.773279 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 06:26:35.774188 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.774177 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 06:26:35.774240 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.774191 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 06:26:35.775908 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.775890 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-55.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 06:26:35.775982 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.775936 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" 
at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 06:26:35.775982 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.775957 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 06:26:35.778077 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.778064 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 06:26:35.778130 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.778104 2577 server.go:1295] "Started kubelet" Apr 21 06:26:35.778268 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.778221 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 06:26:35.778347 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.778293 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 06:26:35.778387 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.778371 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 06:26:35.779055 ip-10-0-129-55 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 06:26:35.779198 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.779048 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ld948" Apr 21 06:26:35.780092 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.780073 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 06:26:35.781076 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.780954 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 21 06:26:35.783160 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.782159 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-55.ec2.internal.18a84b410e08ce2b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-55.ec2.internal,UID:ip-10-0-129-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-55.ec2.internal,},FirstTimestamp:2026-04-21 06:26:35.778076203 +0000 UTC m=+0.430522016,LastTimestamp:2026-04-21 06:26:35.778076203 +0000 UTC m=+0.430522016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-55.ec2.internal,}" Apr 21 06:26:35.785372 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.785359 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 06:26:35.785446 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.785433 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 06:26:35.786088 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.786068 2577 desired_state_of_world_populator.go:150] 
"Desired state populator starts to run" Apr 21 06:26:35.786088 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.786069 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 06:26:35.786212 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.786099 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 06:26:35.786297 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.786279 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 21 06:26:35.786297 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.786283 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:35.786297 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.786295 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 21 06:26:35.787020 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.786999 2577 factory.go:55] Registering systemd factory Apr 21 06:26:35.787097 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.787029 2577 factory.go:223] Registration of the systemd container factory successfully Apr 21 06:26:35.787423 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.787406 2577 factory.go:153] Registering CRI-O factory Apr 21 06:26:35.787423 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.787426 2577 factory.go:223] Registration of the crio container factory successfully Apr 21 06:26:35.787549 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.787515 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 06:26:35.787549 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.787543 2577 factory.go:103] Registering Raw factory Apr 21 06:26:35.787645 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.787557 2577 manager.go:1196] Started watching for new ooms in manager 
Apr 21 06:26:35.788018 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.787998 2577 manager.go:319] Starting recovery of all containers Apr 21 06:26:35.789294 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.789261 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 06:26:35.798571 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.798515 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:35.800690 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.800674 2577 manager.go:324] Recovery completed Apr 21 06:26:35.801007 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.800970 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-55.ec2.internal\" not found" node="ip-10-0-129-55.ec2.internal" Apr 21 06:26:35.804669 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.804655 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 06:26:35.807237 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.807221 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientMemory" Apr 21 06:26:35.807296 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.807251 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 06:26:35.807296 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.807261 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientPID" Apr 21 06:26:35.807777 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.807763 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 06:26:35.807840 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:26:35.807778 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 06:26:35.807840 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.807797 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 21 06:26:35.810028 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.810017 2577 policy_none.go:49] "None policy: Start" Apr 21 06:26:35.810079 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.810033 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 06:26:35.810079 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.810043 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 21 06:26:35.846327 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.846312 2577 manager.go:341] "Starting Device Plugin manager" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.846387 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.846397 2577 server.go:85] "Starting device plugin registration server" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.846648 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.846659 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.846779 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.846866 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.846922 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 
06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.847709 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 06:26:35.862516 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.847766 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:35.923987 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.923938 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 06:26:35.925380 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.925356 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 06:26:35.925380 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.925384 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 06:26:35.925520 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.925403 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 06:26:35.925520 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.925410 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 06:26:35.925520 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.925444 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 06:26:35.927813 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.927763 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:35.947570 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.947545 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 06:26:35.948325 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.948311 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientMemory" Apr 21 06:26:35.948398 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.948340 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 06:26:35.948398 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.948349 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientPID" Apr 21 06:26:35.948398 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.948371 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-55.ec2.internal" Apr 21 06:26:35.955939 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:35.955922 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-55.ec2.internal" Apr 21 06:26:35.955987 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.955945 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-55.ec2.internal\": node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:35.973371 
ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:35.973339 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.026604 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.026567 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal"] Apr 21 06:26:36.026710 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.026664 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 06:26:36.028167 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.028151 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientMemory" Apr 21 06:26:36.028250 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.028182 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 06:26:36.028250 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.028193 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientPID" Apr 21 06:26:36.029458 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.029446 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 06:26:36.029612 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.029599 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.029650 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.029629 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 06:26:36.030229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.030210 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientMemory" Apr 21 06:26:36.030294 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.030244 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 06:26:36.030294 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.030217 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientMemory" Apr 21 06:26:36.030294 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.030275 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 06:26:36.030294 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.030287 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientPID" Apr 21 06:26:36.030406 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.030255 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientPID" Apr 21 06:26:36.031590 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.031575 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.031684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.031596 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 06:26:36.032214 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.032198 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientMemory" Apr 21 06:26:36.032301 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.032225 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 06:26:36.032301 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.032240 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeHasSufficientPID" Apr 21 06:26:36.065609 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.065587 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-55.ec2.internal\" not found" node="ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.070383 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.070367 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-55.ec2.internal\" not found" node="ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.074414 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.074399 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.175337 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.175293 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.187656 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.187601 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99334a5c8c818a9817b6b449ad62bc6b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal\" (UID: \"99334a5c8c818a9817b6b449ad62bc6b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.187656 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.187631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99334a5c8c818a9817b6b449ad62bc6b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal\" (UID: \"99334a5c8c818a9817b6b449ad62bc6b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.187656 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.187651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8a0a94ae8a820fa7aa720c575cce5d73-config\") pod \"kube-apiserver-proxy-ip-10-0-129-55.ec2.internal\" (UID: \"8a0a94ae8a820fa7aa720c575cce5d73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.275991 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.275951 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.288293 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.288269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99334a5c8c818a9817b6b449ad62bc6b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal\" (UID: \"99334a5c8c818a9817b6b449ad62bc6b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.288344 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:26:36.288301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99334a5c8c818a9817b6b449ad62bc6b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal\" (UID: \"99334a5c8c818a9817b6b449ad62bc6b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.288344 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.288317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8a0a94ae8a820fa7aa720c575cce5d73-config\") pod \"kube-apiserver-proxy-ip-10-0-129-55.ec2.internal\" (UID: \"8a0a94ae8a820fa7aa720c575cce5d73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.288407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.288382 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99334a5c8c818a9817b6b449ad62bc6b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal\" (UID: \"99334a5c8c818a9817b6b449ad62bc6b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.288407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.288388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99334a5c8c818a9817b6b449ad62bc6b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal\" (UID: \"99334a5c8c818a9817b6b449ad62bc6b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.288483 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.288364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8a0a94ae8a820fa7aa720c575cce5d73-config\") 
pod \"kube-apiserver-proxy-ip-10-0-129-55.ec2.internal\" (UID: \"8a0a94ae8a820fa7aa720c575cce5d73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.367479 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.367448 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.373130 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.373108 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" Apr 21 06:26:36.376980 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.376960 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.477545 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.477504 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.578014 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.577975 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.678479 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.678447 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.695892 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.695871 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 06:26:36.696027 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.696008 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch 
lasted less than a second and no items received" Apr 21 06:26:36.696066 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.696034 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 06:26:36.722058 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.722036 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:36.778820 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.778763 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.781891 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.781858 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 06:21:35 +0000 UTC" deadline="2028-01-02 21:19:20.927105302 +0000 UTC" Apr 21 06:26:36.781990 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.781892 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14918h52m44.145217948s" Apr 21 06:26:36.786102 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.786083 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 06:26:36.794374 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.794353 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 06:26:36.816373 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.816351 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-cblhf" Apr 21 06:26:36.829786 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.829767 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cblhf" Apr 21 06:26:36.830605 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:36.830565 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a0a94ae8a820fa7aa720c575cce5d73.slice/crio-a0efaad9f25a1063a2132408f8873fc685def952629c6efaf68e055061674d57 WatchSource:0}: Error finding container a0efaad9f25a1063a2132408f8873fc685def952629c6efaf68e055061674d57: Status 404 returned error can't find the container with id a0efaad9f25a1063a2132408f8873fc685def952629c6efaf68e055061674d57 Apr 21 06:26:36.831289 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:36.831269 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99334a5c8c818a9817b6b449ad62bc6b.slice/crio-66a62ffefff62d9bf39ff9b2566645fd371d501cee79ee864507d1b9bd7bfc3a WatchSource:0}: Error finding container 66a62ffefff62d9bf39ff9b2566645fd371d501cee79ee864507d1b9bd7bfc3a: Status 404 returned error can't find the container with id 66a62ffefff62d9bf39ff9b2566645fd371d501cee79ee864507d1b9bd7bfc3a Apr 21 06:26:36.834380 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.834367 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 06:26:36.879500 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.879467 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:36.929027 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.928977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" 
event={"ID":"8a0a94ae8a820fa7aa720c575cce5d73","Type":"ContainerStarted","Data":"a0efaad9f25a1063a2132408f8873fc685def952629c6efaf68e055061674d57"} Apr 21 06:26:36.930014 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:36.929988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" event={"ID":"99334a5c8c818a9817b6b449ad62bc6b","Type":"ContainerStarted","Data":"66a62ffefff62d9bf39ff9b2566645fd371d501cee79ee864507d1b9bd7bfc3a"} Apr 21 06:26:36.980168 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:36.980144 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:37.080745 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:37.080663 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:37.181218 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:37.181180 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-55.ec2.internal\" not found" Apr 21 06:26:37.265892 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.265859 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:37.285804 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.285774 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" Apr 21 06:26:37.297524 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.297499 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 06:26:37.298543 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.298517 2577 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" Apr 21 06:26:37.305756 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.305717 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 06:26:37.647337 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.647302 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:37.765411 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.765374 2577 apiserver.go:52] "Watching apiserver" Apr 21 06:26:37.774329 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.774295 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 06:26:37.774750 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.774712 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-6hkzv","kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z","openshift-dns/node-resolver-2hplw","openshift-image-registry/node-ca-gnb44","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal","openshift-multus/multus-n768c","openshift-network-operator/iptables-alerter-dwt8n","openshift-cluster-node-tuning-operator/tuned-7mw75","openshift-multus/multus-additional-cni-plugins-7vk77","openshift-multus/network-metrics-daemon-276tk","openshift-network-diagnostics/network-check-target-thvnj","openshift-ovn-kubernetes/ovnkube-node-bdm62"] Apr 21 06:26:37.777656 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.777635 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2hplw" Apr 21 06:26:37.778832 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.778812 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.779597 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.779575 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2nkqr\"" Apr 21 06:26:37.779836 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.779810 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 06:26:37.779836 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.779818 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 06:26:37.780273 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.780074 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.781002 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.780985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 06:26:37.781127 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.781115 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 06:26:37.781269 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.781254 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lb22k\"" Apr 21 06:26:37.781986 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.781965 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 06:26:37.782123 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.782110 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j8zxn\"" Apr 21 06:26:37.782186 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.782144 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 06:26:37.782397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.782376 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 06:26:37.783123 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.783104 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n768c" Apr 21 06:26:37.783224 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.783213 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gnb44" Apr 21 06:26:37.784916 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.784408 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:37.785328 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.785309 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 06:26:37.785915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.785426 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 06:26:37.785915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.785718 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 06:26:37.785915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.785748 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 06:26:37.785915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.785805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 06:26:37.785915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.785749 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-284kh\"" Apr 21 06:26:37.785915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.785805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 06:26:37.786520 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.786337 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 06:26:37.786604 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.786586 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-h2gmf\"" Apr 21 
06:26:37.786663 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.786601 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x4kcr\"" Apr 21 06:26:37.787056 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.786847 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:37.787056 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.786861 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 06:26:37.794242 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.794212 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 06:26:37.795441 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.795420 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 06:26:37.795597 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.795578 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:37.795848 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.795829 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6rn2f\"" Apr 21 06:26:37.796136 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796112 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 06:26:37.796136 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796125 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 06:26:37.796280 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.796280 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-system-cni-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.796280 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-cni-multus\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " 
pod="openshift-multus/multus-n768c" Apr 21 06:26:37.796280 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d2bbbed-1117-4391-9611-601532f34a73-multus-daemon-config\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.796454 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d222f49e-3ace-4fc2-9344-97a36ac9bc47-serviceca\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44" Apr 21 06:26:37.796454 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83b7a9cf-9462-4e2a-901b-482dc68cb898-hosts-file\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw" Apr 21 06:26:37.796454 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-lib-modules\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.796454 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-var-lib-kubelet\") pod \"tuned-7mw75\" (UID: 
\"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.796454 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-k8s-cni-cncf-io\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9f0218ab-e007-4a8a-a5d9-1682337a814f-iptables-alerter-script\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d25f7\" (UniqueName: \"kubernetes.io/projected/9f0218ab-e007-4a8a-a5d9-1682337a814f-kube-api-access-d25f7\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-tuned\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-socket-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdrg2\" (UniqueName: \"kubernetes.io/projected/1bb3c06d-f4bc-4567-9958-978a3b9398c2-kube-api-access-jdrg2\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d2bbbed-1117-4391-9611-601532f34a73-cni-binary-copy\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-socket-dir-parent\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.796652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-netns\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797041 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:26:37.796652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-multus-certs\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d222f49e-3ace-4fc2-9344-97a36ac9bc47-host\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-modprobe-d\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysconfig\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-kubernetes\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796819 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-systemd\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-os-release\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1448effd-0e95-4c37-bb1b-6de12bbf9fd9-agent-certs\") pod \"konnectivity-agent-6hkzv\" (UID: \"1448effd-0e95-4c37-bb1b-6de12bbf9fd9\") " pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1448effd-0e95-4c37-bb1b-6de12bbf9fd9-konnectivity-ca\") pod \"konnectivity-agent-6hkzv\" (UID: \"1448effd-0e95-4c37-bb1b-6de12bbf9fd9\") " pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796894 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysctl-d\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-registration-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.796984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-cni-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797041 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:37.796979 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-cni-bin\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797081 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-hostroot\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-etc-kubernetes\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdpdt\" (UniqueName: \"kubernetes.io/projected/d222f49e-3ace-4fc2-9344-97a36ac9bc47-kube-api-access-qdpdt\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797165 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-sys\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-sys-fs\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-cnibin\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f0218ab-e007-4a8a-a5d9-1682337a814f-host-slash\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5w72\" (UniqueName: \"kubernetes.io/projected/83b7a9cf-9462-4e2a-901b-482dc68cb898-kube-api-access-z5w72\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw" Apr 21 
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797276 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-host\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45bfe8d3-3836-468f-bde5-17d1c54e53a8-tmp\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs4sq\" (UniqueName: \"kubernetes.io/projected/45bfe8d3-3836-468f-bde5-17d1c54e53a8-kube-api-access-rs4sq\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-device-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-kubelet\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-conf-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cr55\" (UniqueName: \"kubernetes.io/projected/7d2bbbed-1117-4391-9611-601532f34a73-kube-api-access-5cr55\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.797651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83b7a9cf-9462-4e2a-901b-482dc68cb898-tmp-dir\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw"
Apr 21 06:26:37.798421 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysctl-conf\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.798421 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-run\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.798421 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.797662 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 06:26:37.798421 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.798000 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 06:26:37.798421 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.798051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9dlfm\""
Apr 21 06:26:37.798577 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.798505 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:26:37.798577 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:37.798568 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa"
Apr 21 06:26:37.799881 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.799863 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.802161 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.802015 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 06:26:37.802161 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.802015 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 06:26:37.802161 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.802045 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 06:26:37.802161 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.802062 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vgk92\""
Apr 21 06:26:37.802764 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.802746 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 06:26:37.803063 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.803046 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 06:26:37.803124 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.803060 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 06:26:37.830503 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.830479 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:21:36 +0000 UTC" deadline="2027-11-09 15:35:34.102279866 +0000 UTC"
Apr 21 06:26:37.830606 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.830504 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13617h8m56.271779244s"
Apr 21 06:26:37.887157 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.887129 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 06:26:37.898262 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-systemd-units\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.898262 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovnkube-script-lib\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.898262 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5w72\" (UniqueName: \"kubernetes.io/projected/83b7a9cf-9462-4e2a-901b-482dc68cb898-kube-api-access-z5w72\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45bfe8d3-3836-468f-bde5-17d1c54e53a8-tmp\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-kubelet\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cr55\" (UniqueName: \"kubernetes.io/projected/7d2bbbed-1117-4391-9611-601532f34a73-kube-api-access-5cr55\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-cni-netd\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898354 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-env-overrides\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83b7a9cf-9462-4e2a-901b-482dc68cb898-tmp-dir\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-kubelet\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysctl-conf\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-run\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.898491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-system-cni-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-cni-multus\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-run\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d222f49e-3ace-4fc2-9344-97a36ac9bc47-serviceca\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-var-lib-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-cni-multus\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-system-cni-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-log-socket\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9f0218ab-e007-4a8a-a5d9-1682337a814f-iptables-alerter-script\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysctl-conf\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d65mb\" (UniqueName: \"kubernetes.io/projected/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-kube-api-access-d65mb\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898712 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-tuned\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-socket-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83b7a9cf-9462-4e2a-901b-482dc68cb898-tmp-dir\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdrg2\" (UniqueName: \"kubernetes.io/projected/1bb3c06d-f4bc-4567-9958-978a3b9398c2-kube-api-access-jdrg2\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.899038 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d2bbbed-1117-4391-9611-601532f34a73-cni-binary-copy\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-socket-dir-parent\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-netns\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-multus-certs\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-socket-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d222f49e-3ace-4fc2-9344-97a36ac9bc47-host\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d222f49e-3ace-4fc2-9344-97a36ac9bc47-host\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.898986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysconfig\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-kubernetes\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-netns\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d222f49e-3ace-4fc2-9344-97a36ac9bc47-serviceca\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-kubernetes\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-node-log\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-multus-certs\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-socket-dir-parent\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.899984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysconfig\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899186 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-run-ovn-kubernetes\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-os-release\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899223 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9f0218ab-e007-4a8a-a5d9-1682337a814f-iptables-alerter-script\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysctl-d\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-registration-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899293 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-os-release\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-hostroot\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdpdt\" (UniqueName: \"kubernetes.io/projected/d222f49e-3ace-4fc2-9344-97a36ac9bc47-kube-api-access-qdpdt\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-sysctl-d\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovn-node-metrics-cert\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-hostroot\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-sys-fs\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-ovn\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovnkube-config\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-os-release\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-host\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.900850 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-sys-fs\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs4sq\" (UniqueName: \"kubernetes.io/projected/45bfe8d3-3836-468f-bde5-17d1c54e53a8-kube-api-access-rs4sq\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-device-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-registration-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-host\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-conf-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-device-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d2bbbed-1117-4391-9611-601532f34a73-cni-binary-copy\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-system-cni-dir\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-conf-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cnibin\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhxwx\" (UniqueName: \"kubernetes.io/projected/14257089-c0ac-4007-81fc-ff9a9034e71b-kube-api-access-zhxwx\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d2bbbed-1117-4391-9611-601532f34a73-multus-daemon-config\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c"
Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d25f7\" (UniqueName:
\"kubernetes.io/projected/9f0218ab-e007-4a8a-a5d9-1682337a814f-kube-api-access-d25f7\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-slash\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.901558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7f8d\" (UniqueName: \"kubernetes.io/projected/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-kube-api-access-c7f8d\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83b7a9cf-9462-4e2a-901b-482dc68cb898-hosts-file\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899914 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-lib-modules\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-var-lib-kubelet\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83b7a9cf-9462-4e2a-901b-482dc68cb898-hosts-file\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-k8s-cni-cncf-io\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-systemd\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.899997 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-var-lib-kubelet\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-cni-bin\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-run-k8s-cni-cncf-io\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-lib-modules\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-etc-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900089 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-modprobe-d\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-modprobe-d\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-systemd\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d2bbbed-1117-4391-9611-601532f34a73-multus-daemon-config\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902229 ip-10-0-129-55 kubenswrapper[2577]: 
I0421 06:26:37.900225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1448effd-0e95-4c37-bb1b-6de12bbf9fd9-agent-certs\") pod \"konnectivity-agent-6hkzv\" (UID: \"1448effd-0e95-4c37-bb1b-6de12bbf9fd9\") " pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1448effd-0e95-4c37-bb1b-6de12bbf9fd9-konnectivity-ca\") pod \"konnectivity-agent-6hkzv\" (UID: \"1448effd-0e95-4c37-bb1b-6de12bbf9fd9\") " pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-systemd\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900271 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cni-binary-copy\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " 
pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-kubelet\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-run-netns\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-cni-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-cni-bin\") pod 
\"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1bb3c06d-f4bc-4567-9958-978a3b9398c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-multus-cni-dir\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-host-var-lib-cni-bin\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-etc-kubernetes\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900533 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-etc-kubernetes\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " 
pod="openshift-multus/multus-n768c" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-sys\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.902811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-cnibin\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.903482 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f0218ab-e007-4a8a-a5d9-1682337a814f-host-slash\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:37.903482 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/45bfe8d3-3836-468f-bde5-17d1c54e53a8-sys\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.903482 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900676 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f0218ab-e007-4a8a-a5d9-1682337a814f-host-slash\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:37.903482 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d2bbbed-1117-4391-9611-601532f34a73-cnibin\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.903482 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.900771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1448effd-0e95-4c37-bb1b-6de12bbf9fd9-konnectivity-ca\") pod \"konnectivity-agent-6hkzv\" (UID: \"1448effd-0e95-4c37-bb1b-6de12bbf9fd9\") " pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:37.903482 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.901493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/45bfe8d3-3836-468f-bde5-17d1c54e53a8-etc-tuned\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.903482 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.901571 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45bfe8d3-3836-468f-bde5-17d1c54e53a8-tmp\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.903482 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:26:37.902639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1448effd-0e95-4c37-bb1b-6de12bbf9fd9-agent-certs\") pod \"konnectivity-agent-6hkzv\" (UID: \"1448effd-0e95-4c37-bb1b-6de12bbf9fd9\") " pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:37.910614 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.910585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5w72\" (UniqueName: \"kubernetes.io/projected/83b7a9cf-9462-4e2a-901b-482dc68cb898-kube-api-access-z5w72\") pod \"node-resolver-2hplw\" (UID: \"83b7a9cf-9462-4e2a-901b-482dc68cb898\") " pod="openshift-dns/node-resolver-2hplw" Apr 21 06:26:37.910750 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.910590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdpdt\" (UniqueName: \"kubernetes.io/projected/d222f49e-3ace-4fc2-9344-97a36ac9bc47-kube-api-access-qdpdt\") pod \"node-ca-gnb44\" (UID: \"d222f49e-3ace-4fc2-9344-97a36ac9bc47\") " pod="openshift-image-registry/node-ca-gnb44" Apr 21 06:26:37.911119 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.911094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdrg2\" (UniqueName: \"kubernetes.io/projected/1bb3c06d-f4bc-4567-9958-978a3b9398c2-kube-api-access-jdrg2\") pod \"aws-ebs-csi-driver-node-nsw5z\" (UID: \"1bb3c06d-f4bc-4567-9958-978a3b9398c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" Apr 21 06:26:37.911230 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.911215 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cr55\" (UniqueName: \"kubernetes.io/projected/7d2bbbed-1117-4391-9611-601532f34a73-kube-api-access-5cr55\") pod \"multus-n768c\" (UID: \"7d2bbbed-1117-4391-9611-601532f34a73\") " pod="openshift-multus/multus-n768c" Apr 21 06:26:37.911794 
ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.911759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs4sq\" (UniqueName: \"kubernetes.io/projected/45bfe8d3-3836-468f-bde5-17d1c54e53a8-kube-api-access-rs4sq\") pod \"tuned-7mw75\" (UID: \"45bfe8d3-3836-468f-bde5-17d1c54e53a8\") " pod="openshift-cluster-node-tuning-operator/tuned-7mw75" Apr 21 06:26:37.912828 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:37.912807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d25f7\" (UniqueName: \"kubernetes.io/projected/9f0218ab-e007-4a8a-a5d9-1682337a814f-kube-api-access-d25f7\") pod \"iptables-alerter-dwt8n\" (UID: \"9f0218ab-e007-4a8a-a5d9-1682337a814f\") " pod="openshift-network-operator/iptables-alerter-dwt8n" Apr 21 06:26:38.001888 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.001852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:38.002091 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.001901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cni-binary-copy\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:38.002091 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:38.002091 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77" Apr 21 06:26:38.002253 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-kubelet\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002253 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-run-netns\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002253 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002253 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-systemd-units\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002253 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovnkube-script-lib\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002253 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-run-netns\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002253 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-kubelet\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-cni-netd\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-systemd-units\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002294 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-cni-netd\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-env-overrides\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-var-lib-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002376 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-log-socket\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d65mb\" (UniqueName: \"kubernetes.io/projected/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-kube-api-access-d65mb\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-var-lib-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cni-binary-copy\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-node-log\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-run-ovn-kubernetes\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.002580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-log-socket\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovn-node-metrics-cert\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-ovn\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovnkube-config\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-os-release\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-system-cni-dir\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cnibin\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002688 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhxwx\" (UniqueName: \"kubernetes.io/projected/14257089-c0ac-4007-81fc-ff9a9034e71b-kube-api-access-zhxwx\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-slash\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7f8d\" (UniqueName: \"kubernetes.io/projected/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-kube-api-access-c7f8d\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-env-overrides\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-systemd\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-cni-bin\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-etc-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovnkube-script-lib\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002835 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-os-release\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-etc-openvswitch\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-ovn\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-system-cni-dir\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.002657 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-node-log\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.002988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-run-ovn-kubernetes\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.003544 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:26:38.503482901 +0000 UTC m=+3.155928700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.003897 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cnibin\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.003970 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.003939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.004435 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.003998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovnkube-config\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.004435 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.004046 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-cni-bin\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.004435 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.004089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.004435 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.004105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-host-slash\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.004435 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.004094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-run-systemd\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.008465 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.008446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-ovn-node-metrics-cert\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.009696 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.009675 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 06:26:38.009812 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.009700 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 06:26:38.009812 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.009715 2577 projected.go:194] Error preparing data for projected volume kube-api-access-plqrx for pod openshift-network-diagnostics/network-check-target-thvnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:26:38.009812 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.009794 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx podName:7f77e68e-f3ad-422e-af2d-685ee3a97eaa nodeName:}" failed. No retries permitted until 2026-04-21 06:26:38.509780997 +0000 UTC m=+3.162226794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-plqrx" (UniqueName: "kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx") pod "network-check-target-thvnj" (UID: "7f77e68e-f3ad-422e-af2d-685ee3a97eaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:26:38.010984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.010966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d65mb\" (UniqueName: \"kubernetes.io/projected/5a9984cb-8f3e-47c2-b7d5-3612ff658e70-kube-api-access-d65mb\") pod \"multus-additional-cni-plugins-7vk77\" (UID: \"5a9984cb-8f3e-47c2-b7d5-3612ff658e70\") " pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.011902 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.011886 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhxwx\" (UniqueName: \"kubernetes.io/projected/14257089-c0ac-4007-81fc-ff9a9034e71b-kube-api-access-zhxwx\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:38.012469 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.012450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7f8d\" (UniqueName: \"kubernetes.io/projected/be794aa6-58e2-4d4d-b76c-e85f84c36d7e-kube-api-access-c7f8d\") pod \"ovnkube-node-bdm62\" (UID: \"be794aa6-58e2-4d4d-b76c-e85f84c36d7e\") " pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.097955 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.097920 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2hplw"
Apr 21 06:26:38.103893 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.103868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7mw75"
Apr 21 06:26:38.111495 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.111474 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z"
Apr 21 06:26:38.117091 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.117069 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n768c"
Apr 21 06:26:38.123651 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.123632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gnb44"
Apr 21 06:26:38.130220 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.130201 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6hkzv"
Apr 21 06:26:38.138813 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.138786 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwt8n"
Apr 21 06:26:38.144375 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.144352 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7vk77"
Apr 21 06:26:38.150068 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.150015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:26:38.286641 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.286608 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:38.468383 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:38.468183 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe794aa6_58e2_4d4d_b76c_e85f84c36d7e.slice/crio-84bdb63de0e7b05e1debf5ac35c5b43fb5275956119460d8840f9db05dd4efe4 WatchSource:0}: Error finding container 84bdb63de0e7b05e1debf5ac35c5b43fb5275956119460d8840f9db05dd4efe4: Status 404 returned error can't find the container with id 84bdb63de0e7b05e1debf5ac35c5b43fb5275956119460d8840f9db05dd4efe4
Apr 21 06:26:38.470085 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:38.470039 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0218ab_e007_4a8a_a5d9_1682337a814f.slice/crio-41d5f2ff3d4d104180c1b1caa9f13a0a797cb7f0d4379add88589e4af134c962 WatchSource:0}: Error finding container 41d5f2ff3d4d104180c1b1caa9f13a0a797cb7f0d4379add88589e4af134c962: Status 404 returned error can't find the container with id 41d5f2ff3d4d104180c1b1caa9f13a0a797cb7f0d4379add88589e4af134c962
Apr 21 06:26:38.470799 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:38.470781 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45bfe8d3_3836_468f_bde5_17d1c54e53a8.slice/crio-7661b45fd26880f6d85d3c7bfaf4820e79285ccb0a7769e221523d8dd6296580 WatchSource:0}: Error finding container 7661b45fd26880f6d85d3c7bfaf4820e79285ccb0a7769e221523d8dd6296580: Status 404 returned error can't find the container with id 7661b45fd26880f6d85d3c7bfaf4820e79285ccb0a7769e221523d8dd6296580
Apr 21 06:26:38.474148 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:38.474120 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1448effd_0e95_4c37_bb1b_6de12bbf9fd9.slice/crio-f79eb09431e88d96b0968ee8287277b92e699fe2b26e1f7d9a9b9b489b5199ba WatchSource:0}: Error finding container f79eb09431e88d96b0968ee8287277b92e699fe2b26e1f7d9a9b9b489b5199ba: Status 404 returned error can't find the container with id f79eb09431e88d96b0968ee8287277b92e699fe2b26e1f7d9a9b9b489b5199ba
Apr 21 06:26:38.475380 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:38.475211 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83b7a9cf_9462_4e2a_901b_482dc68cb898.slice/crio-ab9f2e9e870f290f90a644f63962c84f68a731f6e33c0a9dc18908a354ff91c9 WatchSource:0}: Error finding container ab9f2e9e870f290f90a644f63962c84f68a731f6e33c0a9dc18908a354ff91c9: Status 404 returned error can't find the container with id ab9f2e9e870f290f90a644f63962c84f68a731f6e33c0a9dc18908a354ff91c9
Apr 21 06:26:38.476001 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:26:38.475961 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9984cb_8f3e_47c2_b7d5_3612ff658e70.slice/crio-a987fe684351b447b1285b07be3d83838fecbacee41d9b2601d0aaa164bb8b32 WatchSource:0}: Error finding container a987fe684351b447b1285b07be3d83838fecbacee41d9b2601d0aaa164bb8b32: Status 404 returned error can't find the container with id a987fe684351b447b1285b07be3d83838fecbacee41d9b2601d0aaa164bb8b32
Apr 21 06:26:38.504972 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.504944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:38.505092 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.505071 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:26:38.505180 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.505138 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:26:39.505118541 +0000 UTC m=+4.157564339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:26:38.605583 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.605548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:26:38.605787 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.605698 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 06:26:38.605787 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.605720 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 06:26:38.605787 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.605753 2577 projected.go:194] Error preparing data for projected volume kube-api-access-plqrx for pod openshift-network-diagnostics/network-check-target-thvnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:26:38.605955 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.605816 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx podName:7f77e68e-f3ad-422e-af2d-685ee3a97eaa nodeName:}" failed. No retries permitted until 2026-04-21 06:26:39.605797666 +0000 UTC m=+4.258243482 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-plqrx" (UniqueName: "kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx") pod "network-check-target-thvnj" (UID: "7f77e68e-f3ad-422e-af2d-685ee3a97eaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:26:38.831294 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.831240 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:21:36 +0000 UTC" deadline="2027-10-11 19:00:17.222260299 +0000 UTC"
Apr 21 06:26:38.831294 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.831283 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12924h33m38.390981974s"
Apr 21 06:26:38.925822 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.925793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:26:38.925987 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:38.925963 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa"
Apr 21 06:26:38.939578 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.939453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n768c" event={"ID":"7d2bbbed-1117-4391-9611-601532f34a73","Type":"ContainerStarted","Data":"3d52611c62f218e8af287b1a42e7362121c5b9b18b781c9f9e7fdb434bfcc40b"}
Apr 21 06:26:38.941466 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.941416 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2hplw" event={"ID":"83b7a9cf-9462-4e2a-901b-482dc68cb898","Type":"ContainerStarted","Data":"ab9f2e9e870f290f90a644f63962c84f68a731f6e33c0a9dc18908a354ff91c9"}
Apr 21 06:26:38.944244 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.944176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" event={"ID":"1bb3c06d-f4bc-4567-9958-978a3b9398c2","Type":"ContainerStarted","Data":"16bd41bf802efcb29bf582ceb52d0c2df0e401dc2b55a73f06745a01d1ad2d98"}
Apr 21 06:26:38.948503 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.948444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gnb44" event={"ID":"d222f49e-3ace-4fc2-9344-97a36ac9bc47","Type":"ContainerStarted","Data":"9d0e46a56443c18628a130007ec0bdbc21940da49aa6843f6b5e5ffee728da30"}
Apr 21 06:26:38.952066 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.952040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwt8n" event={"ID":"9f0218ab-e007-4a8a-a5d9-1682337a814f","Type":"ContainerStarted","Data":"41d5f2ff3d4d104180c1b1caa9f13a0a797cb7f0d4379add88589e4af134c962"}
Apr 21 06:26:38.964594 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.964567 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"84bdb63de0e7b05e1debf5ac35c5b43fb5275956119460d8840f9db05dd4efe4"}
Apr 21 06:26:38.969124 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.968455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" event={"ID":"8a0a94ae8a820fa7aa720c575cce5d73","Type":"ContainerStarted","Data":"71a923262fda0892416fe103c61645ac52db6490cf819180ed31259b6fb1d295"}
Apr 21 06:26:38.971368 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.971257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerStarted","Data":"a987fe684351b447b1285b07be3d83838fecbacee41d9b2601d0aaa164bb8b32"}
Apr 21 06:26:38.985388 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.985125 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6hkzv" event={"ID":"1448effd-0e95-4c37-bb1b-6de12bbf9fd9","Type":"ContainerStarted","Data":"f79eb09431e88d96b0968ee8287277b92e699fe2b26e1f7d9a9b9b489b5199ba"}
Apr 21 06:26:38.993001 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:38.992974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7mw75" event={"ID":"45bfe8d3-3836-468f-bde5-17d1c54e53a8","Type":"ContainerStarted","Data":"7661b45fd26880f6d85d3c7bfaf4820e79285ccb0a7769e221523d8dd6296580"}
Apr 21 06:26:39.517280 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:39.517242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:39.517460 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:39.517402 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:26:39.517521 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:39.517489 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:26:41.517469642 +0000 UTC m=+6.169915454 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:26:39.617999 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:39.617902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:26:39.618147 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:39.618117 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 06:26:39.618217 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:39.618150 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 06:26:39.618217 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:39.618163 2577 projected.go:194] Error preparing data for projected volume kube-api-access-plqrx for pod openshift-network-diagnostics/network-check-target-thvnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:26:39.618324 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:39.618220 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx podName:7f77e68e-f3ad-422e-af2d-685ee3a97eaa nodeName:}" failed. No retries permitted until 2026-04-21 06:26:41.618201987 +0000 UTC m=+6.270647808 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-plqrx" (UniqueName: "kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx") pod "network-check-target-thvnj" (UID: "7f77e68e-f3ad-422e-af2d-685ee3a97eaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:26:39.926626 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:39.926546 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:26:39.928933 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:39.928864 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b"
Apr 21 06:26:40.016757 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:40.015124 2577 generic.go:358] "Generic (PLEG): container finished" podID="99334a5c8c818a9817b6b449ad62bc6b" containerID="d66920b9a0a46e28d96a55bce0aea63e443cdc6b49b156dc66b192a03a02bce7" exitCode=0
Apr 21 06:26:40.016757 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:40.015683 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" event={"ID":"99334a5c8c818a9817b6b449ad62bc6b","Type":"ContainerDied","Data":"d66920b9a0a46e28d96a55bce0aea63e443cdc6b49b156dc66b192a03a02bce7"}
Apr 21 06:26:40.031117 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:40.030779 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-55.ec2.internal" podStartSLOduration=3.030760241 podStartE2EDuration="3.030760241s" podCreationTimestamp="2026-04-21 06:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:26:38.982279076 +0000 UTC m=+3.634724896" watchObservedRunningTime="2026-04-21 06:26:40.030760241 +0000 UTC m=+4.683206061"
Apr 21 06:26:40.926580 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:40.926122 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:26:40.926580 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:40.926246 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:41.021895 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:41.021856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" event={"ID":"99334a5c8c818a9817b6b449ad62bc6b","Type":"ContainerStarted","Data":"4fc0690c67a041f9d342cc3f6ffad8136cd8b185035e41bf19fffe56f5333663"} Apr 21 06:26:41.036740 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:41.036665 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-55.ec2.internal" podStartSLOduration=4.036644622 podStartE2EDuration="4.036644622s" podCreationTimestamp="2026-04-21 06:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:26:41.035443305 +0000 UTC m=+5.687889126" watchObservedRunningTime="2026-04-21 06:26:41.036644622 +0000 UTC m=+5.689090441" Apr 21 06:26:41.539206 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:41.539161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:41.539383 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:41.539338 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:41.539451 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:41.539406 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs 
podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:26:45.539385474 +0000 UTC m=+10.191831282 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:41.640232 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:41.640189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:41.640398 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:41.640373 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:41.640398 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:41.640393 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:41.640525 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:41.640408 2577 projected.go:194] Error preparing data for projected volume kube-api-access-plqrx for pod openshift-network-diagnostics/network-check-target-thvnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:41.640525 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:41.640484 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx podName:7f77e68e-f3ad-422e-af2d-685ee3a97eaa nodeName:}" failed. No retries permitted until 2026-04-21 06:26:45.64046573 +0000 UTC m=+10.292911527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-plqrx" (UniqueName: "kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx") pod "network-check-target-thvnj" (UID: "7f77e68e-f3ad-422e-af2d-685ee3a97eaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:41.925908 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:41.925832 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:41.926034 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:41.925975 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:42.926560 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:42.926519 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:42.927030 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:42.926649 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:43.926614 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:43.926578 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:43.927047 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:43.926740 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:44.928750 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:44.926271 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:44.928750 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:44.926408 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:45.571689 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:45.571646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:45.571893 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:45.571830 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:45.571893 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:45.571893 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:26:53.571874153 +0000 UTC m=+18.224319953 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:45.672594 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:45.672552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:45.672773 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:45.672752 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:45.672865 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:45.672775 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:45.672865 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:45.672789 2577 projected.go:194] Error preparing data for projected volume kube-api-access-plqrx for pod openshift-network-diagnostics/network-check-target-thvnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:45.672865 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:45.672848 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx podName:7f77e68e-f3ad-422e-af2d-685ee3a97eaa nodeName:}" failed. 
No retries permitted until 2026-04-21 06:26:53.672829851 +0000 UTC m=+18.325275653 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-plqrx" (UniqueName: "kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx") pod "network-check-target-thvnj" (UID: "7f77e68e-f3ad-422e-af2d-685ee3a97eaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:45.928448 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:45.927932 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:45.928448 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:45.928070 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:46.926477 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:46.926442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:46.926944 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:46.926574 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:47.925801 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:47.925769 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:47.925959 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:47.925883 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:48.926619 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:48.926581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:48.927087 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:48.926712 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:49.926013 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:49.925967 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:49.926201 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:49.926114 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:50.926104 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:50.926074 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:50.926487 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:50.926187 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:51.928922 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:51.928894 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:51.929361 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:51.929012 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:52.925644 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:52.925603 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:52.925840 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:52.925746 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:53.629429 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:53.629392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:53.629876 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:53.629535 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:53.629876 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:53.629597 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:27:09.629579011 +0000 UTC m=+34.282024820 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:53.730416 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:53.730374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:53.730713 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:53.730533 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:53.730713 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:53.730556 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:53.730713 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:53.730573 2577 projected.go:194] Error preparing data for projected volume kube-api-access-plqrx for pod openshift-network-diagnostics/network-check-target-thvnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:53.730713 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:53.730624 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx podName:7f77e68e-f3ad-422e-af2d-685ee3a97eaa nodeName:}" failed. 
No retries permitted until 2026-04-21 06:27:09.73061043 +0000 UTC m=+34.383056227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-plqrx" (UniqueName: "kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx") pod "network-check-target-thvnj" (UID: "7f77e68e-f3ad-422e-af2d-685ee3a97eaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:53.926201 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:53.926130 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:53.926352 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:53.926241 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:54.926150 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:54.926117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:54.926609 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:54.926228 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:55.932336 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:55.932125 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:55.932712 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:55.932435 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:56.050712 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:56.050677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6hkzv" event={"ID":"1448effd-0e95-4c37-bb1b-6de12bbf9fd9","Type":"ContainerStarted","Data":"fc22ad2d6cf6cf6de84388ad12558c461020dc146bddfafa2bcc2fe691424d43"} Apr 21 06:26:56.055270 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:56.054696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7mw75" event={"ID":"45bfe8d3-3836-468f-bde5-17d1c54e53a8","Type":"ContainerStarted","Data":"1e2d5abe454efd05a25f46a9dd4b6061f34147de4f9d0eb063d8ff3738eb183b"} Apr 21 06:26:56.077645 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:56.077596 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6hkzv" podStartSLOduration=10.794197429 podStartE2EDuration="20.077578813s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.476630028 +0000 UTC m=+3.129075824" lastFinishedPulling="2026-04-21 06:26:47.760011394 +0000 UTC m=+12.412457208" observedRunningTime="2026-04-21 06:26:56.064217364 
+0000 UTC m=+20.716663186" watchObservedRunningTime="2026-04-21 06:26:56.077578813 +0000 UTC m=+20.730024629" Apr 21 06:26:56.078010 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:56.077983 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7mw75" podStartSLOduration=3.76232763 podStartE2EDuration="21.077974621s" podCreationTimestamp="2026-04-21 06:26:35 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.472904427 +0000 UTC m=+3.125350223" lastFinishedPulling="2026-04-21 06:26:55.788551409 +0000 UTC m=+20.440997214" observedRunningTime="2026-04-21 06:26:56.077719076 +0000 UTC m=+20.730164887" watchObservedRunningTime="2026-04-21 06:26:56.077974621 +0000 UTC m=+20.730420440" Apr 21 06:26:56.926243 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:56.926053 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:56.926382 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:56.926319 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:57.057562 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.057529 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9984cb-8f3e-47c2-b7d5-3612ff658e70" containerID="233da66872f9f47917055e6ead57eb2617bb828b3779959512542967e3f5b1d3" exitCode=0 Apr 21 06:26:57.058456 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.057632 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerDied","Data":"233da66872f9f47917055e6ead57eb2617bb828b3779959512542967e3f5b1d3"} Apr 21 06:26:57.059007 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.058988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n768c" event={"ID":"7d2bbbed-1117-4391-9611-601532f34a73","Type":"ContainerStarted","Data":"41c8ffd661985850a6d41028bfe2ff3f0e1b0a349158e668669114ac90bb838b"} Apr 21 06:26:57.060171 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.060151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2hplw" event={"ID":"83b7a9cf-9462-4e2a-901b-482dc68cb898","Type":"ContainerStarted","Data":"a3682c8078168b27948a32c331687bac85a25a4afc7ef09c78091c0a9d692510"} Apr 21 06:26:57.061355 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.061336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" event={"ID":"1bb3c06d-f4bc-4567-9958-978a3b9398c2","Type":"ContainerStarted","Data":"d220d915f636306b1813d12861ec53b3e206b23169548935381a64c9cdb390a1"} Apr 21 06:26:57.062553 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.062535 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gnb44" 
event={"ID":"d222f49e-3ace-4fc2-9344-97a36ac9bc47","Type":"ContainerStarted","Data":"e3e2bc99b1a15a6e2c03ab0f0b42c4009cd7532cfd7f63889018767d4d49e3d0"} Apr 21 06:26:57.064684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.064669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:26:57.064985 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.064969 2577 generic.go:358] "Generic (PLEG): container finished" podID="be794aa6-58e2-4d4d-b76c-e85f84c36d7e" containerID="bce6a5209ddeff42ed82538ba9942907e23e339a46040b205d60f4702113a7d5" exitCode=1 Apr 21 06:26:57.065045 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.065030 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"e6af62bf9e93e7f6eb837586876f01c767a25b04bb563674acb6934f9563de50"} Apr 21 06:26:57.065082 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.065051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"82d6f3e33aa150a50949eef00ce3fee521b47860b3433b34777c9e701833add9"} Apr 21 06:26:57.065082 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.065060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"fd437434b24295f4e426c50641b2251ded1e35247373ac9bdb408ded2477d220"} Apr 21 06:26:57.065082 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.065069 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" 
event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"1b59d8aa5ba5383062ee863c9ed46a9734e7d822e16d176465d16cce3365d1da"} Apr 21 06:26:57.065082 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.065077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerDied","Data":"bce6a5209ddeff42ed82538ba9942907e23e339a46040b205d60f4702113a7d5"} Apr 21 06:26:57.065201 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.065086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"ec27b0a4e33f21ffc0387ac4d648d1d05ec5412159f59611a7d0a24f5e244d85"} Apr 21 06:26:57.092060 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.091991 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n768c" podStartSLOduration=3.743703347 podStartE2EDuration="21.091979595s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.480071535 +0000 UTC m=+3.132517337" lastFinishedPulling="2026-04-21 06:26:55.828347776 +0000 UTC m=+20.480793585" observedRunningTime="2026-04-21 06:26:57.091825384 +0000 UTC m=+21.744271204" watchObservedRunningTime="2026-04-21 06:26:57.091979595 +0000 UTC m=+21.744425414" Apr 21 06:26:57.115773 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.115716 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2hplw" podStartSLOduration=4.804182922 podStartE2EDuration="22.115703274s" podCreationTimestamp="2026-04-21 06:26:35 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.477011726 +0000 UTC m=+3.129457524" lastFinishedPulling="2026-04-21 06:26:55.788532078 +0000 UTC m=+20.440977876" observedRunningTime="2026-04-21 06:26:57.115656001 +0000 UTC m=+21.768101820" 
watchObservedRunningTime="2026-04-21 06:26:57.115703274 +0000 UTC m=+21.768149093" Apr 21 06:26:57.115875 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.115804 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gnb44" podStartSLOduration=3.808343931 podStartE2EDuration="21.115799298s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.48092637 +0000 UTC m=+3.133372173" lastFinishedPulling="2026-04-21 06:26:55.788381725 +0000 UTC m=+20.440827540" observedRunningTime="2026-04-21 06:26:57.103912183 +0000 UTC m=+21.756358002" watchObservedRunningTime="2026-04-21 06:26:57.115799298 +0000 UTC m=+21.768245118" Apr 21 06:26:57.366976 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.366949 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 06:26:57.858099 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.857983 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T06:26:57.366971087Z","UUID":"2b02aa95-75b2-40ea-85d0-d8c7a95a82a1","Handler":null,"Name":"","Endpoint":""} Apr 21 06:26:57.859946 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.859913 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 06:26:57.859946 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.859941 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 06:26:57.927768 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:57.927499 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:57.927922 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:57.927872 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:26:58.068757 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:58.068704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" event={"ID":"1bb3c06d-f4bc-4567-9958-978a3b9398c2","Type":"ContainerStarted","Data":"1abc198df3082ec584232fd23c9c3b276e8ccc3cfe42b54144dba5b549d026a7"} Apr 21 06:26:58.070087 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:58.070046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwt8n" event={"ID":"9f0218ab-e007-4a8a-a5d9-1682337a814f","Type":"ContainerStarted","Data":"7e12b3379f907b5d9d59c1595fe8feb2cdb19f0574689891da8bd77ef7f14d58"} Apr 21 06:26:58.084831 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:58.084779 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dwt8n" podStartSLOduration=4.787666784 podStartE2EDuration="22.084762963s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.472601993 +0000 UTC m=+3.125047789" lastFinishedPulling="2026-04-21 06:26:55.769698171 +0000 UTC m=+20.422143968" observedRunningTime="2026-04-21 06:26:58.084502438 +0000 UTC m=+22.736948259" watchObservedRunningTime="2026-04-21 06:26:58.084762963 +0000 UTC m=+22.737208786" Apr 21 06:26:58.809766 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:58.809550 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:58.810161 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:58.810136 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:58.926161 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:58.926126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:26:58.926319 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:58.926257 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:26:59.074236 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:59.074149 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" event={"ID":"1bb3c06d-f4bc-4567-9958-978a3b9398c2","Type":"ContainerStarted","Data":"726ee2007deb74a22b6dd6aa8f36ce952696464a9621fbf19bae4d66f353fe6e"} Apr 21 06:26:59.077383 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:59.077362 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:26:59.078259 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:59.078229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"37fca6f5d228ace590addc67bb2e198f4923f18dff0ef935d4ac0343db587ca7"} Apr 21 06:26:59.078372 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:26:59.078269 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:59.078690 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:59.078673 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6hkzv" Apr 21 06:26:59.090258 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:59.090213 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nsw5z" podStartSLOduration=4.022801164 podStartE2EDuration="24.090199234s" podCreationTimestamp="2026-04-21 06:26:35 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.481621274 +0000 UTC m=+3.134067085" lastFinishedPulling="2026-04-21 06:26:58.549019356 +0000 UTC m=+23.201465155" observedRunningTime="2026-04-21 06:26:59.089636571 +0000 UTC m=+23.742082391" watchObservedRunningTime="2026-04-21 06:26:59.090199234 +0000 UTC m=+23.742645054" Apr 21 06:26:59.926340 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:26:59.926314 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:26:59.926484 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:26:59.926462 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:27:00.925811 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:00.925778 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:00.926397 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:00.925907 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:27:01.926141 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:01.925884 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:27:01.926550 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:01.926259 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:27:02.084721 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.084696 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:27:02.085081 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.085058 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"d9f75a5e0af755ea2bc0b6e67898ec9d1cb14839026ca1a5ab74cb80be93056e"} Apr 21 06:27:02.085318 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.085297 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:27:02.085318 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.085328 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:27:02.085467 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.085341 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:27:02.085516 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.085474 2577 scope.go:117] "RemoveContainer" containerID="bce6a5209ddeff42ed82538ba9942907e23e339a46040b205d60f4702113a7d5" Apr 21 06:27:02.087028 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.087005 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9984cb-8f3e-47c2-b7d5-3612ff658e70" containerID="3d52cd2f0b400423975d35bd356f8fbd06caa7b8824fc60e8712a70e6461aad9" exitCode=0 Apr 21 06:27:02.087120 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.087056 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" 
event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerDied","Data":"3d52cd2f0b400423975d35bd356f8fbd06caa7b8824fc60e8712a70e6461aad9"} Apr 21 06:27:02.101598 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.101579 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:27:02.101685 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.101643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" Apr 21 06:27:02.926142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:02.926103 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:02.926588 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:02.926239 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:27:03.072172 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.072139 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-thvnj"] Apr 21 06:27:03.072858 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.072837 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-276tk"] Apr 21 06:27:03.072979 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.072966 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:27:03.073090 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:03.073069 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:27:03.092806 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.092779 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:27:03.093155 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.093123 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" event={"ID":"be794aa6-58e2-4d4d-b76c-e85f84c36d7e","Type":"ContainerStarted","Data":"52b619bdb56d8a3ec2d3f30e7f8c69333845ff5e3dd72e1b06e0f6f99f30f08b"} Apr 21 06:27:03.095097 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.095060 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9984cb-8f3e-47c2-b7d5-3612ff658e70" containerID="e60cdca51067fd0fe898c2a5d91d15e40f39a013d237d4bcea9a8ba8f31eee2f" exitCode=0 Apr 21 06:27:03.095186 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.095120 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerDied","Data":"e60cdca51067fd0fe898c2a5d91d15e40f39a013d237d4bcea9a8ba8f31eee2f"} Apr 21 06:27:03.095264 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.095249 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:03.095373 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:03.095355 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:27:03.116851 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:03.116768 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62" podStartSLOduration=9.699087259 podStartE2EDuration="27.116751986s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.469740645 +0000 UTC m=+3.122186455" lastFinishedPulling="2026-04-21 06:26:55.887405384 +0000 UTC m=+20.539851182" observedRunningTime="2026-04-21 06:27:03.115428663 +0000 UTC m=+27.767874482" watchObservedRunningTime="2026-04-21 06:27:03.116751986 +0000 UTC m=+27.769197796" Apr 21 06:27:04.099202 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:04.099168 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9984cb-8f3e-47c2-b7d5-3612ff658e70" containerID="b2db2e2f2e850e82477ea993d5c70e098d0c382ce4c31edc524d84c685e95678" exitCode=0 Apr 21 06:27:04.099574 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:04.099247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerDied","Data":"b2db2e2f2e850e82477ea993d5c70e098d0c382ce4c31edc524d84c685e95678"} Apr 21 06:27:04.926020 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:04.925982 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:27:04.926197 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:04.925982 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:04.926197 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:04.926110 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:27:04.926308 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:04.926191 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:27:06.926247 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:06.926211 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:27:06.927022 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:06.926209 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:06.927022 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:06.926368 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:27:06.927022 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:06.926396 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:27:08.926486 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:08.926414 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:08.927045 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:08.926414 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:27:08.927045 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:08.926522 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-thvnj" podUID="7f77e68e-f3ad-422e-af2d-685ee3a97eaa" Apr 21 06:27:08.927045 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:08.926631 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-276tk" podUID="14257089-c0ac-4007-81fc-ff9a9034e71b" Apr 21 06:27:09.196202 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.196126 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-55.ec2.internal" event="NodeReady" Apr 21 06:27:09.196363 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.196271 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 06:27:09.241302 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.241268 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rkcmp"] Apr 21 06:27:09.269386 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.269361 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2dlsg"] Apr 21 06:27:09.269562 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.269515 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.271898 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.271614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz466\"" Apr 21 06:27:09.271898 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.271629 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 06:27:09.271898 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.271649 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 06:27:09.284374 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.284338 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rkcmp"] Apr 21 06:27:09.284374 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.284366 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2dlsg"] Apr 21 06:27:09.284537 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.284478 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2dlsg" Apr 21 06:27:09.286776 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.286758 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 06:27:09.286907 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.286790 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 06:27:09.286907 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.286764 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 06:27:09.287240 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.287218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-98jt7\"" Apr 21 06:27:09.348111 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.348073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsk5b\" (UniqueName: \"kubernetes.io/projected/f85b9a72-4484-46f7-bab5-6a307b7bd43f-kube-api-access-gsk5b\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.348282 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.348123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f85b9a72-4484-46f7-bab5-6a307b7bd43f-tmp-dir\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.348282 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.348245 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f85b9a72-4484-46f7-bab5-6a307b7bd43f-config-volume\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.348377 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.348292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.449462 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.449374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f85b9a72-4484-46f7-bab5-6a307b7bd43f-tmp-dir\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.449462 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.449439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgpnx\" (UniqueName: \"kubernetes.io/projected/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-kube-api-access-sgpnx\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg" Apr 21 06:27:09.449684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.449585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f85b9a72-4484-46f7-bab5-6a307b7bd43f-config-volume\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.449684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.449634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.449824 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.449695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg" Apr 21 06:27:09.449824 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.449759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsk5b\" (UniqueName: \"kubernetes.io/projected/f85b9a72-4484-46f7-bab5-6a307b7bd43f-kube-api-access-gsk5b\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.449824 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.449792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f85b9a72-4484-46f7-bab5-6a307b7bd43f-tmp-dir\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:27:09.449966 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.449904 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:27:09.450014 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.449976 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls podName:f85b9a72-4484-46f7-bab5-6a307b7bd43f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:09.949952005 +0000 UTC m=+34.602397804 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls") pod "dns-default-rkcmp" (UID: "f85b9a72-4484-46f7-bab5-6a307b7bd43f") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:09.450537 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.450505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f85b9a72-4484-46f7-bab5-6a307b7bd43f-config-volume\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:09.461056 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.461032 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsk5b\" (UniqueName: \"kubernetes.io/projected/f85b9a72-4484-46f7-bab5-6a307b7bd43f-kube-api-access-gsk5b\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:09.551008 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.550828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:09.551151 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.551038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgpnx\" (UniqueName: \"kubernetes.io/projected/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-kube-api-access-sgpnx\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:09.551151 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.550971 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:09.551151 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.551142 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert podName:3336a9c5-62bd-44a2-8149-ccbdebfdb50a nodeName:}" failed. No retries permitted until 2026-04-21 06:27:10.051125724 +0000 UTC m=+34.703571521 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert") pod "ingress-canary-2dlsg" (UID: "3336a9c5-62bd-44a2-8149-ccbdebfdb50a") : secret "canary-serving-cert" not found
Apr 21 06:27:09.560157 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.560137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgpnx\" (UniqueName: \"kubernetes.io/projected/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-kube-api-access-sgpnx\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:09.651528 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.651484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:27:09.651680 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.651622 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:27:09.651723 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.651688 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.651673512 +0000 UTC m=+66.304119313 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 06:27:09.752330 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.752304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:27:09.752516 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.752493 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 06:27:09.752567 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.752523 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 06:27:09.752567 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.752536 2577 projected.go:194] Error preparing data for projected volume kube-api-access-plqrx for pod openshift-network-diagnostics/network-check-target-thvnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:27:09.752637 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.752597 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx podName:7f77e68e-f3ad-422e-af2d-685ee3a97eaa nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.752579082 +0000 UTC m=+66.405024893 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-plqrx" (UniqueName: "kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx") pod "network-check-target-thvnj" (UID: "7f77e68e-f3ad-422e-af2d-685ee3a97eaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 06:27:09.954420 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:09.954397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:09.954754 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.954574 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:09.954754 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:09.954649 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls podName:f85b9a72-4484-46f7-bab5-6a307b7bd43f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:10.954628531 +0000 UTC m=+35.607074341 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls") pod "dns-default-rkcmp" (UID: "f85b9a72-4484-46f7-bab5-6a307b7bd43f") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:10.055672 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.055648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:10.055801 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:10.055777 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:10.055844 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:10.055827 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert podName:3336a9c5-62bd-44a2-8149-ccbdebfdb50a nodeName:}" failed. No retries permitted until 2026-04-21 06:27:11.055815309 +0000 UTC m=+35.708261105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert") pod "ingress-canary-2dlsg" (UID: "3336a9c5-62bd-44a2-8149-ccbdebfdb50a") : secret "canary-serving-cert" not found
Apr 21 06:27:10.114332 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.114293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerStarted","Data":"4aa87db266d9c076f10ee6d19b45a358ec4c4d3600669ad7ea1de7e28e515e49"}
Apr 21 06:27:10.925777 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.925710 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk"
Apr 21 06:27:10.925946 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.925710 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj"
Apr 21 06:27:10.928294 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.928260 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 06:27:10.929232 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.929204 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 06:27:10.929337 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.929232 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nvgbf\""
Apr 21 06:27:10.929337 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.929244 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p6jr2\""
Apr 21 06:27:10.929337 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.929208 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 06:27:10.962791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:10.962766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:10.963088 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:10.962903 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:10.963088 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:10.962957 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls podName:f85b9a72-4484-46f7-bab5-6a307b7bd43f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:12.962942245 +0000 UTC m=+37.615388042 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls") pod "dns-default-rkcmp" (UID: "f85b9a72-4484-46f7-bab5-6a307b7bd43f") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:11.063262 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:11.063235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:11.063389 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:11.063372 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:11.063439 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:11.063428 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert podName:3336a9c5-62bd-44a2-8149-ccbdebfdb50a nodeName:}" failed. No retries permitted until 2026-04-21 06:27:13.063413821 +0000 UTC m=+37.715859619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert") pod "ingress-canary-2dlsg" (UID: "3336a9c5-62bd-44a2-8149-ccbdebfdb50a") : secret "canary-serving-cert" not found
Apr 21 06:27:11.118455 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:11.118425 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9984cb-8f3e-47c2-b7d5-3612ff658e70" containerID="4aa87db266d9c076f10ee6d19b45a358ec4c4d3600669ad7ea1de7e28e515e49" exitCode=0
Apr 21 06:27:11.118592 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:11.118492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerDied","Data":"4aa87db266d9c076f10ee6d19b45a358ec4c4d3600669ad7ea1de7e28e515e49"}
Apr 21 06:27:12.122692 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:12.122661 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9984cb-8f3e-47c2-b7d5-3612ff658e70" containerID="d87b9917c8c84be001ccd2b5f6fb4d551a257dc608db79d7ccecbb38c45adf0e" exitCode=0
Apr 21 06:27:12.123071 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:12.122720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerDied","Data":"d87b9917c8c84be001ccd2b5f6fb4d551a257dc608db79d7ccecbb38c45adf0e"}
Apr 21 06:27:12.976754 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:12.976689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:12.976934 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:12.976845 2577 secret.go:189] Couldn't get secret
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:12.976934 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:12.976910 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls podName:f85b9a72-4484-46f7-bab5-6a307b7bd43f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:16.976895863 +0000 UTC m=+41.629341660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls") pod "dns-default-rkcmp" (UID: "f85b9a72-4484-46f7-bab5-6a307b7bd43f") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:13.077425 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:13.077389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:13.077569 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:13.077534 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:13.077607 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:13.077592 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert podName:3336a9c5-62bd-44a2-8149-ccbdebfdb50a nodeName:}" failed. No retries permitted until 2026-04-21 06:27:17.077575187 +0000 UTC m=+41.730020984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert") pod "ingress-canary-2dlsg" (UID: "3336a9c5-62bd-44a2-8149-ccbdebfdb50a") : secret "canary-serving-cert" not found
Apr 21 06:27:13.127742 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:13.127693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7vk77" event={"ID":"5a9984cb-8f3e-47c2-b7d5-3612ff658e70","Type":"ContainerStarted","Data":"3d208b70a01441f8b398aa36afce21e33d77a88119d54983f24dc3434f43fb49"}
Apr 21 06:27:13.149386 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:13.149326 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7vk77" podStartSLOduration=5.69482635 podStartE2EDuration="37.14931042s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:26:38.478278538 +0000 UTC m=+3.130724337" lastFinishedPulling="2026-04-21 06:27:09.932762606 +0000 UTC m=+34.585208407" observedRunningTime="2026-04-21 06:27:13.1476893 +0000 UTC m=+37.800135131" watchObservedRunningTime="2026-04-21 06:27:13.14931042 +0000 UTC m=+37.801756241"
Apr 21 06:27:17.006833 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:17.006798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:17.007199 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:17.006962 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:17.007199 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:17.007029 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls podName:f85b9a72-4484-46f7-bab5-6a307b7bd43f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:25.007012493 +0000 UTC m=+49.659458290 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls") pod "dns-default-rkcmp" (UID: "f85b9a72-4484-46f7-bab5-6a307b7bd43f") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:17.107805 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:17.107775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:17.107952 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:17.107928 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:17.108012 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:17.108003 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert podName:3336a9c5-62bd-44a2-8149-ccbdebfdb50a nodeName:}" failed. No retries permitted until 2026-04-21 06:27:25.107985068 +0000 UTC m=+49.760430865 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert") pod "ingress-canary-2dlsg" (UID: "3336a9c5-62bd-44a2-8149-ccbdebfdb50a") : secret "canary-serving-cert" not found
Apr 21 06:27:25.063452 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:25.063408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:25.063993 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:25.063523 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:25.063993 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:25.063576 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls podName:f85b9a72-4484-46f7-bab5-6a307b7bd43f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.063562089 +0000 UTC m=+65.716007886 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls") pod "dns-default-rkcmp" (UID: "f85b9a72-4484-46f7-bab5-6a307b7bd43f") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:25.164441 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:25.164408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg"
Apr 21 06:27:25.164613 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:25.164552 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:25.164613 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:25.164608 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert podName:3336a9c5-62bd-44a2-8149-ccbdebfdb50a nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.164593257 +0000 UTC m=+65.817039056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert") pod "ingress-canary-2dlsg" (UID: "3336a9c5-62bd-44a2-8149-ccbdebfdb50a") : secret "canary-serving-cert" not found
Apr 21 06:27:34.109681 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:34.109649 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bdm62"
Apr 21 06:27:40.879624 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.879585 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"]
Apr 21 06:27:40.929369 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.929333 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"]
Apr 21 06:27:40.929542 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.929474 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"
Apr 21 06:27:40.931971 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.931950 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 21 06:27:40.932127 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.931972 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:27:40.932127 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.931996 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5z8r4\""
Apr 21 06:27:40.932890 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.932871 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 21 06:27:40.933014 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.932944 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 21 06:27:40.975762 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.975711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/504b76b3-d116-4731-aca7-01cb1970de58-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"
Apr 21 06:27:40.975953 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.975805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/504b76b3-d116-4731-aca7-01cb1970de58-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"
Apr 21 06:27:40.975953 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.975835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llmpr\" (UniqueName: \"kubernetes.io/projected/504b76b3-d116-4731-aca7-01cb1970de58-kube-api-access-llmpr\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"
Apr 21 06:27:40.982395 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:40.982365 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd"]
Apr 21 06:27:41.004305 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.004272 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc"]
Apr 21 06:27:41.004462 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.004417 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd"
Apr 21 06:27:41.006884 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.006860 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 06:27:41.007831 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.007803 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-88l6t\""
Apr 21 06:27:41.007950 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.007862 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 06:27:41.007950 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.007807 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 06:27:41.008079 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.007890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 06:27:41.018653 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.018627 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fdnr4"]
Apr 21 06:27:41.018808 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.018793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc"
Apr 21 06:27:41.020922 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.020903 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 21 06:27:41.020922 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.020916 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 21 06:27:41.021091 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.020916 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:27:41.021091 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.020944 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-n977b\""
Apr 21 06:27:41.039627 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.039601 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5hfwt"]
Apr 21 06:27:41.039778 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.039768 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4"
Apr 21 06:27:41.043213 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.043197 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hl8km\""
Apr 21 06:27:41.043652 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.043631 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 21 06:27:41.043771 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.043668 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 21 06:27:41.043926 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.043910 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 21 06:27:41.043976 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.043911 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:27:41.050123 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.050105 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 21 06:27:41.057385 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.057366 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5968f9cfc4-wxd8m"]
Apr 21 06:27:41.057526 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.057512 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5hfwt"
Apr 21 06:27:41.060800 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.060781 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-fgfgv\""
Apr 21 06:27:41.061118 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.061104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 21 06:27:41.061218 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.061203 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 06:27:41.061507 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.061495 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 21 06:27:41.061662 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.061648 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 06:27:41.072884 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.072866 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 21 06:27:41.076094 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llmpr\" (UniqueName: \"kubernetes.io/projected/504b76b3-d116-4731-aca7-01cb1970de58-kube-api-access-llmpr\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"
Apr 21 06:27:41.076171 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076104 2577 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6803da32-a76e-4d0e-916c-a12f322ff600-trusted-ca\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4"
Apr 21 06:27:41.076171 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt"
Apr 21 06:27:41.076171 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-tmp\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt"
Apr 21 06:27:41.076171 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82g7d\" (UniqueName: \"kubernetes.io/projected/6803da32-a76e-4d0e-916c-a12f322ff600-kube-api-access-82g7d\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4"
Apr 21 06:27:41.076324 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:27:41.076324 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076226 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/634bd9b2-8299-43f0-9124-eb65af43af1e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd"
Apr 21 06:27:41.076324 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd"
Apr 21 06:27:41.076324 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfzzx\" (UniqueName: \"kubernetes.io/projected/634bd9b2-8299-43f0-9124-eb65af43af1e-kube-api-access-xfzzx\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd"
Apr 21 06:27:41.076324 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-snapshots\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt"
Apr 21 06:27:41.076324 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/504b76b3-d116-4731-aca7-01cb1970de58-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"
Apr 21 06:27:41.076324 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.076293 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:41.076547 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.076397 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls podName:f85b9a72-4484-46f7-bab5-6a307b7bd43f nodeName:}" failed. No retries permitted until 2026-04-21 06:28:13.076379206 +0000 UTC m=+97.728825012 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls") pod "dns-default-rkcmp" (UID: "f85b9a72-4484-46f7-bab5-6a307b7bd43f") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:41.076547 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc"
Apr 21 06:27:41.076547 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bpzl\" (UniqueName: \"kubernetes.io/projected/50263a7c-1596-4353-a40d-4453e307fb4f-kube-api-access-2bpzl\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc"
Apr 21 06:27:41.076547 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6803da32-a76e-4d0e-916c-a12f322ff600-config\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4"
Apr 21 06:27:41.076691 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-service-ca-bundle\") pod
\"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.076691 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076586 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cptb\" (UniqueName: \"kubernetes.io/projected/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-kube-api-access-7cptb\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.076691 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/504b76b3-d116-4731-aca7-01cb1970de58-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" Apr 21 06:27:41.076691 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-serving-cert\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.076860 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.076816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6803da32-a76e-4d0e-916c-a12f322ff600-serving-cert\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.077070 
ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.077055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/504b76b3-d116-4731-aca7-01cb1970de58-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" Apr 21 06:27:41.079553 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.079532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/504b76b3-d116-4731-aca7-01cb1970de58-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" Apr 21 06:27:41.081391 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.081370 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd"] Apr 21 06:27:41.081391 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.081392 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc"] Apr 21 06:27:41.081523 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.081401 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fdnr4"] Apr 21 06:27:41.081523 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.081409 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5hfwt"] Apr 21 06:27:41.081523 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.081418 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5968f9cfc4-wxd8m"] Apr 21 06:27:41.081523 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:27:41.081505 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.083654 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.083626 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 06:27:41.083654 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.083650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 06:27:41.083974 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.083953 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-cpdrh\"" Apr 21 06:27:41.084058 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.084010 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 06:27:41.084147 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.084133 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 06:27:41.084203 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.084190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 06:27:41.084303 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.084289 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 06:27:41.090176 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.090158 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr"] Apr 21 06:27:41.096004 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.095986 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-llmpr\" (UniqueName: \"kubernetes.io/projected/504b76b3-d116-4731-aca7-01cb1970de58-kube-api-access-llmpr\") pod \"kube-storage-version-migrator-operator-6769c5d45-j2sxp\" (UID: \"504b76b3-d116-4731-aca7-01cb1970de58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" Apr 21 06:27:41.114569 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.114543 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7579d96757-p2wbq"] Apr 21 06:27:41.114759 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.114704 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" Apr 21 06:27:41.116965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.116946 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 06:27:41.117089 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.116947 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 06:27:41.117089 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.117039 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-2v5kc\"" Apr 21 06:27:41.139703 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.139649 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr"] Apr 21 06:27:41.139703 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.139673 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7579d96757-p2wbq"] Apr 21 06:27:41.139876 ip-10-0-129-55 kubenswrapper[2577]: 
I0421 06:27:41.139780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.141867 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.141847 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 06:27:41.142035 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.142021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9wmrl\"" Apr 21 06:27:41.142113 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.142097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 06:27:41.142167 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.142101 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 06:27:41.147088 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.147070 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 06:27:41.177984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.177949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.177984 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.177988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpfhd\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-kube-api-access-xpfhd\") pod 
\"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.178206 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.178206 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8hxj\" (UniqueName: \"kubernetes.io/projected/58c896dd-85e3-47f6-a9db-a8d9d4542bf1-kube-api-access-t8hxj\") pod \"volume-data-source-validator-7c6cbb6c87-mshlr\" (UID: \"58c896dd-85e3-47f6-a9db-a8d9d4542bf1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" Apr 21 06:27:41.178206 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-registry-certificates\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.178206 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-tmp\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.178407 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:27:41.178216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.178407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/634bd9b2-8299-43f0-9124-eb65af43af1e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:41.178407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178278 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:41.178407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-snapshots\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.178407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:41.178407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-bound-sa-token\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.178407 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcl8x\" (UniqueName: \"kubernetes.io/projected/d743fd44-3762-47ee-9a4c-617f122ba333-kube-api-access-jcl8x\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.178639 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.178639 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.178431 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:41.178639 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-tmp\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.178639 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-default-certificate\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.178639 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.178502 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls podName:634bd9b2-8299-43f0-9124-eb65af43af1e nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.67848246 +0000 UTC m=+66.330928257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmsbd" (UID: "634bd9b2-8299-43f0-9124-eb65af43af1e") : secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:41.178639 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.178601 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 06:27:41.178876 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.178689 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls podName:50263a7c-1596-4353-a40d-4453e307fb4f nodeName:}" failed. 
No retries permitted until 2026-04-21 06:27:41.678670464 +0000 UTC m=+66.331116260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2g6hc" (UID: "50263a7c-1596-4353-a40d-4453e307fb4f") : secret "samples-operator-tls" not found Apr 21 06:27:41.178876 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6803da32-a76e-4d0e-916c-a12f322ff600-serving-cert\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.178876 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6803da32-a76e-4d0e-916c-a12f322ff600-trusted-ca\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.178876 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178795 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20805057-b0bc-4289-a705-2e946efadc98-ca-trust-extracted\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.178876 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-installation-pull-secrets\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.178876 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g7d\" (UniqueName: \"kubernetes.io/projected/6803da32-a76e-4d0e-916c-a12f322ff600-kube-api-access-82g7d\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfzzx\" (UniqueName: \"kubernetes.io/projected/634bd9b2-8299-43f0-9124-eb65af43af1e-kube-api-access-xfzzx\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bpzl\" (UniqueName: \"kubernetes.io/projected/50263a7c-1596-4353-a40d-4453e307fb4f-kube-api-access-2bpzl\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-snapshots\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " 
pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.178965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-image-registry-private-configuration\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-trusted-ca\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/634bd9b2-8299-43f0-9124-eb65af43af1e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.179100 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6803da32-a76e-4d0e-916c-a12f322ff600-config\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.179136 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert podName:3336a9c5-62bd-44a2-8149-ccbdebfdb50a nodeName:}" failed. No retries permitted until 2026-04-21 06:28:13.179124158 +0000 UTC m=+97.831569959 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert") pod "ingress-canary-2dlsg" (UID: "3336a9c5-62bd-44a2-8149-ccbdebfdb50a") : secret "canary-serving-cert" not found Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-service-ca-bundle\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.179180 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cptb\" (UniqueName: \"kubernetes.io/projected/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-kube-api-access-7cptb\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.179834 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-serving-cert\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.179834 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-stats-auth\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.179834 
ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-service-ca-bundle\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.179834 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6803da32-a76e-4d0e-916c-a12f322ff600-trusted-ca\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.179960 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.179862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6803da32-a76e-4d0e-916c-a12f322ff600-config\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.181468 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.181447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6803da32-a76e-4d0e-916c-a12f322ff600-serving-cert\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.181577 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.181561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-serving-cert\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " 
pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.187430 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.187401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bpzl\" (UniqueName: \"kubernetes.io/projected/50263a7c-1596-4353-a40d-4453e307fb4f-kube-api-access-2bpzl\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:41.187551 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.187512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfzzx\" (UniqueName: \"kubernetes.io/projected/634bd9b2-8299-43f0-9124-eb65af43af1e-kube-api-access-xfzzx\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:41.187709 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.187689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82g7d\" (UniqueName: \"kubernetes.io/projected/6803da32-a76e-4d0e-916c-a12f322ff600-kube-api-access-82g7d\") pod \"console-operator-9d4b6777b-fdnr4\" (UID: \"6803da32-a76e-4d0e-916c-a12f322ff600\") " pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.187982 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.187964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cptb\" (UniqueName: \"kubernetes.io/projected/fb78ba44-67d7-4a52-b661-9a1c6e9c6b38-kube-api-access-7cptb\") pod \"insights-operator-585dfdc468-5hfwt\" (UID: \"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38\") " pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.237959 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.237927 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" Apr 21 06:27:41.280468 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20805057-b0bc-4289-a705-2e946efadc98-ca-trust-extracted\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.280468 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-installation-pull-secrets\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.280468 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-image-registry-private-configuration\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-trusted-ca\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280522 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-stats-auth\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpfhd\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-kube-api-access-xpfhd\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8hxj\" (UniqueName: \"kubernetes.io/projected/58c896dd-85e3-47f6-a9db-a8d9d4542bf1-kube-api-access-t8hxj\") pod \"volume-data-source-validator-7c6cbb6c87-mshlr\" (UID: \"58c896dd-85e3-47f6-a9db-a8d9d4542bf1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280599 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-registry-certificates\") pod 
\"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.280767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20805057-b0bc-4289-a705-2e946efadc98-ca-trust-extracted\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.281124 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.280836 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 06:27:41.281124 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.280903 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.78088466 +0000 UTC m=+66.433330464 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : secret "router-metrics-certs-default" not found Apr 21 06:27:41.281124 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-bound-sa-token\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.281124 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.280973 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcl8x\" (UniqueName: \"kubernetes.io/projected/d743fd44-3762-47ee-9a4c-617f122ba333-kube-api-access-jcl8x\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.281124 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.281000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.281124 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.281033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-default-certificate\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 
21 06:27:41.281414 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.281248 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.781216319 +0000 UTC m=+66.433662138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : configmap references non-existent config key: service-ca.crt Apr 21 06:27:41.282502 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.281568 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:27:41.282502 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.281587 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7579d96757-p2wbq: secret "image-registry-tls" not found Apr 21 06:27:41.282502 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.281634 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls podName:20805057-b0bc-4289-a705-2e946efadc98 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:41.781619036 +0000 UTC m=+66.434064845 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls") pod "image-registry-7579d96757-p2wbq" (UID: "20805057-b0bc-4289-a705-2e946efadc98") : secret "image-registry-tls" not found Apr 21 06:27:41.282844 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.282620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-registry-certificates\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.282844 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.282751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-trusted-ca\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.283400 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.283372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-installation-pull-secrets\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.283662 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.283638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-stats-auth\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.283820 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:27:41.283792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-image-registry-private-configuration\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.284074 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.284052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-default-certificate\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.289895 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.289876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcl8x\" (UniqueName: \"kubernetes.io/projected/d743fd44-3762-47ee-9a4c-617f122ba333-kube-api-access-jcl8x\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.290131 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.290110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8hxj\" (UniqueName: \"kubernetes.io/projected/58c896dd-85e3-47f6-a9db-a8d9d4542bf1-kube-api-access-t8hxj\") pod \"volume-data-source-validator-7c6cbb6c87-mshlr\" (UID: \"58c896dd-85e3-47f6-a9db-a8d9d4542bf1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" Apr 21 06:27:41.290400 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.290384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-bound-sa-token\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.290659 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.290642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpfhd\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-kube-api-access-xpfhd\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.348549 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.348511 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:41.367194 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.365968 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5hfwt" Apr 21 06:27:41.419239 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.419152 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp"] Apr 21 06:27:41.422803 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:27:41.422772 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod504b76b3_d116_4731_aca7_01cb1970de58.slice/crio-228f3a2b60e2c2631fd719e0ac63365adfc811e0a716398345c5c5f5304fd1e6 WatchSource:0}: Error finding container 228f3a2b60e2c2631fd719e0ac63365adfc811e0a716398345c5c5f5304fd1e6: Status 404 returned error can't find the container with id 228f3a2b60e2c2631fd719e0ac63365adfc811e0a716398345c5c5f5304fd1e6 Apr 21 06:27:41.422918 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.422812 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" Apr 21 06:27:41.505955 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.505902 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fdnr4"] Apr 21 06:27:41.508987 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:27:41.508944 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6803da32_a76e_4d0e_916c_a12f322ff600.slice/crio-ccba95f02920a9a3ebd0a522ef7e9ac5728ff7ae54046a580b9eed112d2eda64 WatchSource:0}: Error finding container ccba95f02920a9a3ebd0a522ef7e9ac5728ff7ae54046a580b9eed112d2eda64: Status 404 returned error can't find the container with id ccba95f02920a9a3ebd0a522ef7e9ac5728ff7ae54046a580b9eed112d2eda64 Apr 21 06:27:41.523177 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.523146 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5hfwt"] Apr 21 06:27:41.530360 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:27:41.530334 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb78ba44_67d7_4a52_b661_9a1c6e9c6b38.slice/crio-af21b994cd8345cf37db77cd6d2c9cb0eaae126c55a42e55f9af5de141e4054a WatchSource:0}: Error finding container af21b994cd8345cf37db77cd6d2c9cb0eaae126c55a42e55f9af5de141e4054a: Status 404 returned error can't find the container with id af21b994cd8345cf37db77cd6d2c9cb0eaae126c55a42e55f9af5de141e4054a Apr 21 06:27:41.556698 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.556670 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr"] Apr 21 06:27:41.559242 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:27:41.559219 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c896dd_85e3_47f6_a9db_a8d9d4542bf1.slice/crio-1df52f756bf7e3e3d5e11fd9ac6cf264f6a5a1d82e70bfa2217b5d43172a7904 WatchSource:0}: Error finding container 1df52f756bf7e3e3d5e11fd9ac6cf264f6a5a1d82e70bfa2217b5d43172a7904: Status 404 returned error can't find the container with id 1df52f756bf7e3e3d5e11fd9ac6cf264f6a5a1d82e70bfa2217b5d43172a7904 Apr 21 06:27:41.685158 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.685074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:27:41.685158 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.685142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:41.685339 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.685177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:41.685339 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.685231 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 
21 06:27:41.685339 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.685271 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 06:27:41.685339 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.685294 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls podName:634bd9b2-8299-43f0-9124-eb65af43af1e nodeName:}" failed. No retries permitted until 2026-04-21 06:27:42.685279347 +0000 UTC m=+67.337725143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmsbd" (UID: "634bd9b2-8299-43f0-9124-eb65af43af1e") : secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:41.685339 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.685314 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls podName:50263a7c-1596-4353-a40d-4453e307fb4f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:42.685302424 +0000 UTC m=+67.337748222 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2g6hc" (UID: "50263a7c-1596-4353-a40d-4453e307fb4f") : secret "samples-operator-tls" not found Apr 21 06:27:41.687423 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.687407 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 06:27:41.696137 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.696118 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 06:27:41.696250 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.696168 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs podName:14257089-c0ac-4007-81fc-ff9a9034e71b nodeName:}" failed. No retries permitted until 2026-04-21 06:28:45.696152907 +0000 UTC m=+130.348598703 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs") pod "network-metrics-daemon-276tk" (UID: "14257089-c0ac-4007-81fc-ff9a9034e71b") : secret "metrics-daemon-secret" not found Apr 21 06:27:41.785934 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.785884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:41.786126 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.785981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.786126 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.786016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:41.786126 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.786077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:41.786289 
ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.786132 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:42.786111334 +0000 UTC m=+67.438557134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : configmap references non-existent config key: service-ca.crt Apr 21 06:27:41.786289 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.786156 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 06:27:41.786289 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.786185 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:27:41.786289 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.786199 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7579d96757-p2wbq: secret "image-registry-tls" not found Apr 21 06:27:41.786289 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.786206 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:42.786194593 +0000 UTC m=+67.438640389 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : secret "router-metrics-certs-default" not found Apr 21 06:27:41.786289 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:41.786240 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls podName:20805057-b0bc-4289-a705-2e946efadc98 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:42.786225389 +0000 UTC m=+67.438671206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls") pod "image-registry-7579d96757-p2wbq" (UID: "20805057-b0bc-4289-a705-2e946efadc98") : secret "image-registry-tls" not found Apr 21 06:27:41.788228 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.788212 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 06:27:41.798024 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.798004 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 06:27:41.809145 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.809123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqrx\" (UniqueName: \"kubernetes.io/projected/7f77e68e-f3ad-422e-af2d-685ee3a97eaa-kube-api-access-plqrx\") pod \"network-check-target-thvnj\" (UID: \"7f77e68e-f3ad-422e-af2d-685ee3a97eaa\") " pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:41.842157 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.842129 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nvgbf\"" Apr 21 06:27:41.850986 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.850965 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:41.982511 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:41.982468 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-thvnj"] Apr 21 06:27:41.985874 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:27:41.985839 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f77e68e_f3ad_422e_af2d_685ee3a97eaa.slice/crio-3cc410d20bf9c9dec0f2311c90151217b3593f612441c6e7d7c8b9c56fe8bbea WatchSource:0}: Error finding container 3cc410d20bf9c9dec0f2311c90151217b3593f612441c6e7d7c8b9c56fe8bbea: Status 404 returned error can't find the container with id 3cc410d20bf9c9dec0f2311c90151217b3593f612441c6e7d7c8b9c56fe8bbea Apr 21 06:27:42.185669 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.185559 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-thvnj" event={"ID":"7f77e68e-f3ad-422e-af2d-685ee3a97eaa","Type":"ContainerStarted","Data":"3cc410d20bf9c9dec0f2311c90151217b3593f612441c6e7d7c8b9c56fe8bbea"} Apr 21 06:27:42.186810 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.186768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" event={"ID":"58c896dd-85e3-47f6-a9db-a8d9d4542bf1","Type":"ContainerStarted","Data":"1df52f756bf7e3e3d5e11fd9ac6cf264f6a5a1d82e70bfa2217b5d43172a7904"} Apr 21 06:27:42.189568 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.189532 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" 
event={"ID":"6803da32-a76e-4d0e-916c-a12f322ff600","Type":"ContainerStarted","Data":"ccba95f02920a9a3ebd0a522ef7e9ac5728ff7ae54046a580b9eed112d2eda64"} Apr 21 06:27:42.191472 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.191447 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" event={"ID":"504b76b3-d116-4731-aca7-01cb1970de58","Type":"ContainerStarted","Data":"228f3a2b60e2c2631fd719e0ac63365adfc811e0a716398345c5c5f5304fd1e6"} Apr 21 06:27:42.208120 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.208068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5hfwt" event={"ID":"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38","Type":"ContainerStarted","Data":"af21b994cd8345cf37db77cd6d2c9cb0eaae126c55a42e55f9af5de141e4054a"} Apr 21 06:27:42.696076 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.696029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:42.696264 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.696099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:42.696327 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.696281 2577 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 06:27:42.696388 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.696348 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls podName:50263a7c-1596-4353-a40d-4453e307fb4f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:44.696327819 +0000 UTC m=+69.348773619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2g6hc" (UID: "50263a7c-1596-4353-a40d-4453e307fb4f") : secret "samples-operator-tls" not found Apr 21 06:27:42.696926 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.696801 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:42.696926 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.696858 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls podName:634bd9b2-8299-43f0-9124-eb65af43af1e nodeName:}" failed. No retries permitted until 2026-04-21 06:27:44.696842672 +0000 UTC m=+69.349288470 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmsbd" (UID: "634bd9b2-8299-43f0-9124-eb65af43af1e") : secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.797210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.797279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:42.797385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.797405 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:44.797380482 +0000 UTC m=+69.449826304 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : configmap references non-existent config key: service-ca.crt Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.797486 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.797502 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7579d96757-p2wbq: secret "image-registry-tls" not found Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.797550 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls podName:20805057-b0bc-4289-a705-2e946efadc98 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:44.797532403 +0000 UTC m=+69.449978213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls") pod "image-registry-7579d96757-p2wbq" (UID: "20805057-b0bc-4289-a705-2e946efadc98") : secret "image-registry-tls" not found Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.797616 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 06:27:42.797672 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:42.797649 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. 
No retries permitted until 2026-04-21 06:27:44.797637952 +0000 UTC m=+69.450083756 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : secret "router-metrics-certs-default" not found Apr 21 06:27:44.715626 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:44.715584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:44.716103 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:44.715650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:44.716103 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.715755 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:44.716103 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.715834 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 06:27:44.716103 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.715838 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls podName:634bd9b2-8299-43f0-9124-eb65af43af1e nodeName:}" failed. No retries permitted until 2026-04-21 06:27:48.715813866 +0000 UTC m=+73.368259671 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmsbd" (UID: "634bd9b2-8299-43f0-9124-eb65af43af1e") : secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:44.716103 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.715912 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls podName:50263a7c-1596-4353-a40d-4453e307fb4f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:48.715895109 +0000 UTC m=+73.368340906 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2g6hc" (UID: "50263a7c-1596-4353-a40d-4453e307fb4f") : secret "samples-operator-tls" not found Apr 21 06:27:44.816697 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:44.816664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:44.816869 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:44.816744 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:44.816869 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:44.816809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:44.816869 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.816840 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 06:27:44.816969 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.816887 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 
06:27:44.816969 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.816899 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:48.816883568 +0000 UTC m=+73.469329365 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : secret "router-metrics-certs-default" not found Apr 21 06:27:44.816969 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.816902 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7579d96757-p2wbq: secret "image-registry-tls" not found Apr 21 06:27:44.816969 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.816914 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:48.816906925 +0000 UTC m=+73.469352722 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : configmap references non-existent config key: service-ca.crt Apr 21 06:27:44.816969 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:44.816937 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls podName:20805057-b0bc-4289-a705-2e946efadc98 nodeName:}" failed. 
No retries permitted until 2026-04-21 06:27:48.81692506 +0000 UTC m=+73.469370861 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls") pod "image-registry-7579d96757-p2wbq" (UID: "20805057-b0bc-4289-a705-2e946efadc98") : secret "image-registry-tls" not found Apr 21 06:27:46.218553 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.218524 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/0.log" Apr 21 06:27:46.219009 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.218563 2577 generic.go:358] "Generic (PLEG): container finished" podID="6803da32-a76e-4d0e-916c-a12f322ff600" containerID="7a9c0ea9dc0d03038411380c78d1b0188f585e01cda7921a2d3b693ddfde04f0" exitCode=255 Apr 21 06:27:46.219009 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.218599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" event={"ID":"6803da32-a76e-4d0e-916c-a12f322ff600","Type":"ContainerDied","Data":"7a9c0ea9dc0d03038411380c78d1b0188f585e01cda7921a2d3b693ddfde04f0"} Apr 21 06:27:46.219009 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.218973 2577 scope.go:117] "RemoveContainer" containerID="7a9c0ea9dc0d03038411380c78d1b0188f585e01cda7921a2d3b693ddfde04f0" Apr 21 06:27:46.220134 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.220112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" event={"ID":"504b76b3-d116-4731-aca7-01cb1970de58","Type":"ContainerStarted","Data":"7bce394258ad157e5de24e5dfb555c72a548d4d9a2b61590f0f7c3ef0f887150"} Apr 21 06:27:46.221606 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.221580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-insights/insights-operator-585dfdc468-5hfwt" event={"ID":"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38","Type":"ContainerStarted","Data":"5cc20b7e998117798c3045958aea75f18d6c4fbf6df0c8d5569b32f73bf17f95"} Apr 21 06:27:46.223131 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.223113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-thvnj" event={"ID":"7f77e68e-f3ad-422e-af2d-685ee3a97eaa","Type":"ContainerStarted","Data":"dde6a16d7c43f95910b97d642172d2787aef95dbeb263460ebdce8d19a564fdf"} Apr 21 06:27:46.223249 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.223236 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:27:46.224532 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.224510 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" event={"ID":"58c896dd-85e3-47f6-a9db-a8d9d4542bf1","Type":"ContainerStarted","Data":"e786a8b103f992eea2dbe7ac95c7b7874b993945ac4dc9c065d68e1f49327209"} Apr 21 06:27:46.247208 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.247157 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" podStartSLOduration=1.863768514 podStartE2EDuration="6.247140468s" podCreationTimestamp="2026-04-21 06:27:40 +0000 UTC" firstStartedPulling="2026-04-21 06:27:41.428986891 +0000 UTC m=+66.081432694" lastFinishedPulling="2026-04-21 06:27:45.81235885 +0000 UTC m=+70.464804648" observedRunningTime="2026-04-21 06:27:46.245603797 +0000 UTC m=+70.898049617" watchObservedRunningTime="2026-04-21 06:27:46.247140468 +0000 UTC m=+70.899586291" Apr 21 06:27:46.260215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.260162 2577 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-5hfwt" podStartSLOduration=1.9788523 podStartE2EDuration="6.260140792s" podCreationTimestamp="2026-04-21 06:27:40 +0000 UTC" firstStartedPulling="2026-04-21 06:27:41.532285356 +0000 UTC m=+66.184731152" lastFinishedPulling="2026-04-21 06:27:45.813573846 +0000 UTC m=+70.466019644" observedRunningTime="2026-04-21 06:27:46.258617676 +0000 UTC m=+70.911063511" watchObservedRunningTime="2026-04-21 06:27:46.260140792 +0000 UTC m=+70.912586613" Apr 21 06:27:46.273199 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.273145 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-thvnj" podStartSLOduration=66.400903222 podStartE2EDuration="1m10.273127831s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:27:41.988341265 +0000 UTC m=+66.640787062" lastFinishedPulling="2026-04-21 06:27:45.860565873 +0000 UTC m=+70.513011671" observedRunningTime="2026-04-21 06:27:46.271604242 +0000 UTC m=+70.924050062" watchObservedRunningTime="2026-04-21 06:27:46.273127831 +0000 UTC m=+70.925573654" Apr 21 06:27:46.285505 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:46.285456 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mshlr" podStartSLOduration=1.035915436 podStartE2EDuration="5.285437934s" podCreationTimestamp="2026-04-21 06:27:41 +0000 UTC" firstStartedPulling="2026-04-21 06:27:41.560959034 +0000 UTC m=+66.213404831" lastFinishedPulling="2026-04-21 06:27:45.810481532 +0000 UTC m=+70.462927329" observedRunningTime="2026-04-21 06:27:46.28467275 +0000 UTC m=+70.937118571" watchObservedRunningTime="2026-04-21 06:27:46.285437934 +0000 UTC m=+70.937883756" Apr 21 06:27:47.228517 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:47.228483 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:27:47.229001 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:47.228904 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/0.log" Apr 21 06:27:47.229001 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:47.228935 2577 generic.go:358] "Generic (PLEG): container finished" podID="6803da32-a76e-4d0e-916c-a12f322ff600" containerID="00adaeb9b30731207b4ad824d81652851f2f4938c1f152b5f16e92ebb1c92d21" exitCode=255 Apr 21 06:27:47.229079 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:47.229047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" event={"ID":"6803da32-a76e-4d0e-916c-a12f322ff600","Type":"ContainerDied","Data":"00adaeb9b30731207b4ad824d81652851f2f4938c1f152b5f16e92ebb1c92d21"} Apr 21 06:27:47.229132 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:47.229092 2577 scope.go:117] "RemoveContainer" containerID="7a9c0ea9dc0d03038411380c78d1b0188f585e01cda7921a2d3b693ddfde04f0" Apr 21 06:27:47.229314 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:47.229290 2577 scope.go:117] "RemoveContainer" containerID="00adaeb9b30731207b4ad824d81652851f2f4938c1f152b5f16e92ebb1c92d21" Apr 21 06:27:47.229531 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:47.229510 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdnr4_openshift-console-operator(6803da32-a76e-4d0e-916c-a12f322ff600)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" podUID="6803da32-a76e-4d0e-916c-a12f322ff600" Apr 21 06:27:48.232412 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:27:48.232381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:27:48.232822 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:48.232749 2577 scope.go:117] "RemoveContainer" containerID="00adaeb9b30731207b4ad824d81652851f2f4938c1f152b5f16e92ebb1c92d21" Apr 21 06:27:48.232960 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.232942 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdnr4_openshift-console-operator(6803da32-a76e-4d0e-916c-a12f322ff600)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" podUID="6803da32-a76e-4d0e-916c-a12f322ff600" Apr 21 06:27:48.696475 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:48.696403 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2hplw_83b7a9cf-9462-4e2a-901b-482dc68cb898/dns-node-resolver/0.log" Apr 21 06:27:48.752243 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:48.752209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:48.752392 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:48.752250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:48.752392 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.752343 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 06:27:48.752392 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.752345 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:48.752508 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.752423 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls podName:634bd9b2-8299-43f0-9124-eb65af43af1e nodeName:}" failed. No retries permitted until 2026-04-21 06:27:56.752407959 +0000 UTC m=+81.404853756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmsbd" (UID: "634bd9b2-8299-43f0-9124-eb65af43af1e") : secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:48.752508 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.752438 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls podName:50263a7c-1596-4353-a40d-4453e307fb4f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:56.752430948 +0000 UTC m=+81.404876745 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2g6hc" (UID: "50263a7c-1596-4353-a40d-4453e307fb4f") : secret "samples-operator-tls" not found Apr 21 06:27:48.853369 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:48.853329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:48.853530 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:48.853395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:48.853530 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.853477 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 06:27:48.853530 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:48.853495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:48.853530 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.853512 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 
06:27:48.853530 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.853522 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7579d96757-p2wbq: secret "image-registry-tls" not found Apr 21 06:27:48.853691 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.853535 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:56.85351971 +0000 UTC m=+81.505965507 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : secret "router-metrics-certs-default" not found Apr 21 06:27:48.853691 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.853570 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls podName:20805057-b0bc-4289-a705-2e946efadc98 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:56.853555108 +0000 UTC m=+81.506000905 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls") pod "image-registry-7579d96757-p2wbq" (UID: "20805057-b0bc-4289-a705-2e946efadc98") : secret "image-registry-tls" not found Apr 21 06:27:48.853691 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:48.853599 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:56.853588202 +0000 UTC m=+81.506033998 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : configmap references non-existent config key: service-ca.crt Apr 21 06:27:49.637554 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.637519 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-99g55"] Apr 21 06:27:49.640324 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.640307 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.644027 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.644002 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 06:27:49.644215 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.644002 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-cpcfj\"" Apr 21 06:27:49.644622 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.644599 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 06:27:49.644622 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.644620 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 06:27:49.644852 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.644623 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 06:27:49.650847 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.650826 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-99g55"] Apr 21 06:27:49.696662 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:27:49.696634 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gnb44_d222f49e-3ace-4fc2-9344-97a36ac9bc47/node-ca/0.log" Apr 21 06:27:49.761426 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.761389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/90669bf5-ed00-4c9d-bc20-88d224b7c071-signing-cabundle\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.761594 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.761467 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqccb\" (UniqueName: \"kubernetes.io/projected/90669bf5-ed00-4c9d-bc20-88d224b7c071-kube-api-access-kqccb\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.761594 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.761529 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/90669bf5-ed00-4c9d-bc20-88d224b7c071-signing-key\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.862071 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.862036 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/90669bf5-ed00-4c9d-bc20-88d224b7c071-signing-cabundle\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.862207 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:27:49.862111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqccb\" (UniqueName: \"kubernetes.io/projected/90669bf5-ed00-4c9d-bc20-88d224b7c071-kube-api-access-kqccb\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.862207 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.862173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/90669bf5-ed00-4c9d-bc20-88d224b7c071-signing-key\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.862674 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.862644 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/90669bf5-ed00-4c9d-bc20-88d224b7c071-signing-cabundle\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.864698 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.864675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/90669bf5-ed00-4c9d-bc20-88d224b7c071-signing-key\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.869682 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.869655 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqccb\" (UniqueName: \"kubernetes.io/projected/90669bf5-ed00-4c9d-bc20-88d224b7c071-kube-api-access-kqccb\") pod \"service-ca-865cb79987-99g55\" (UID: \"90669bf5-ed00-4c9d-bc20-88d224b7c071\") " 
pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:49.949015 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:49.948925 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-99g55" Apr 21 06:27:50.061912 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:50.061880 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-99g55"] Apr 21 06:27:50.065031 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:27:50.065005 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90669bf5_ed00_4c9d_bc20_88d224b7c071.slice/crio-3a29d9e7fe4fe8775ff118cc7cb61e0c36126fc21c5f5b105f7b58588a6a0af5 WatchSource:0}: Error finding container 3a29d9e7fe4fe8775ff118cc7cb61e0c36126fc21c5f5b105f7b58588a6a0af5: Status 404 returned error can't find the container with id 3a29d9e7fe4fe8775ff118cc7cb61e0c36126fc21c5f5b105f7b58588a6a0af5 Apr 21 06:27:50.238975 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:50.238930 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-99g55" event={"ID":"90669bf5-ed00-4c9d-bc20-88d224b7c071","Type":"ContainerStarted","Data":"3a29d9e7fe4fe8775ff118cc7cb61e0c36126fc21c5f5b105f7b58588a6a0af5"} Apr 21 06:27:51.349589 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:51.349556 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:51.350092 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:51.349603 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:27:51.350092 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:51.350061 2577 scope.go:117] "RemoveContainer" 
containerID="00adaeb9b30731207b4ad824d81652851f2f4938c1f152b5f16e92ebb1c92d21" Apr 21 06:27:51.350321 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:51.350290 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fdnr4_openshift-console-operator(6803da32-a76e-4d0e-916c-a12f322ff600)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" podUID="6803da32-a76e-4d0e-916c-a12f322ff600" Apr 21 06:27:52.245532 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:52.245490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-99g55" event={"ID":"90669bf5-ed00-4c9d-bc20-88d224b7c071","Type":"ContainerStarted","Data":"868dedc96a49e57e50f055db6404f1d15d2556c54afcbd780c2b87541ea4a4b6"} Apr 21 06:27:52.259489 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:52.259437 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-99g55" podStartSLOduration=1.716178301 podStartE2EDuration="3.25942279s" podCreationTimestamp="2026-04-21 06:27:49 +0000 UTC" firstStartedPulling="2026-04-21 06:27:50.066875294 +0000 UTC m=+74.719321091" lastFinishedPulling="2026-04-21 06:27:51.610119783 +0000 UTC m=+76.262565580" observedRunningTime="2026-04-21 06:27:52.258616206 +0000 UTC m=+76.911062028" watchObservedRunningTime="2026-04-21 06:27:52.25942279 +0000 UTC m=+76.911868608" Apr 21 06:27:56.825291 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.825247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:56.825697 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.825381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:27:56.825697 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:56.825461 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:56.825697 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:56.825513 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls podName:634bd9b2-8299-43f0-9124-eb65af43af1e nodeName:}" failed. No retries permitted until 2026-04-21 06:28:12.825499756 +0000 UTC m=+97.477945552 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-gmsbd" (UID: "634bd9b2-8299-43f0-9124-eb65af43af1e") : secret "cluster-monitoring-operator-tls" not found Apr 21 06:27:56.828414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.828392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/50263a7c-1596-4353-a40d-4453e307fb4f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2g6hc\" (UID: \"50263a7c-1596-4353-a40d-4453e307fb4f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:56.926414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.926378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:56.926605 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.926415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" Apr 21 06:27:56.926605 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:56.926492 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 06:27:56.926605 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:56.926539 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. 
No retries permitted until 2026-04-21 06:28:12.926525026 +0000 UTC m=+97.578970831 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : secret "router-metrics-certs-default" not found Apr 21 06:27:56.926605 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.926422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:27:56.926605 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:27:56.926565 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle podName:d743fd44-3762-47ee-9a4c-617f122ba333 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:12.926546905 +0000 UTC m=+97.578992708 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle") pod "router-default-5968f9cfc4-wxd8m" (UID: "d743fd44-3762-47ee-9a4c-617f122ba333") : configmap references non-existent config key: service-ca.crt Apr 21 06:27:56.926908 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.926657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:56.931475 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:56.930199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"image-registry-7579d96757-p2wbq\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:57.045545 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.045364 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc"] Apr 21 06:27:57.048376 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.048353 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:57.173299 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.173268 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7579d96757-p2wbq"] Apr 21 06:27:57.176472 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:27:57.176441 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20805057_b0bc_4289_a705_2e946efadc98.slice/crio-13e26414c9c5e053be761d3cf2a21da8725f9e0fa249035e5de3d8e1ee5f7008 WatchSource:0}: Error finding container 13e26414c9c5e053be761d3cf2a21da8725f9e0fa249035e5de3d8e1ee5f7008: Status 404 returned error can't find the container with id 13e26414c9c5e053be761d3cf2a21da8725f9e0fa249035e5de3d8e1ee5f7008 Apr 21 06:27:57.260374 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.260338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" event={"ID":"20805057-b0bc-4289-a705-2e946efadc98","Type":"ContainerStarted","Data":"5a008cbcd9e20460a2cab6f31f2d24520be00cd07bdb272fffc5e391f902ad37"} Apr 21 06:27:57.260534 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.260383 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" event={"ID":"20805057-b0bc-4289-a705-2e946efadc98","Type":"ContainerStarted","Data":"13e26414c9c5e053be761d3cf2a21da8725f9e0fa249035e5de3d8e1ee5f7008"} Apr 21 06:27:57.260534 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.260460 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:27:57.261422 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.261385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" 
event={"ID":"50263a7c-1596-4353-a40d-4453e307fb4f","Type":"ContainerStarted","Data":"4d2e7d70048fb34602be2eb5c7556f3fab037bb50c157bb7b3a2b4b0651be4fd"} Apr 21 06:27:57.277555 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:57.277497 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" podStartSLOduration=16.277477518 podStartE2EDuration="16.277477518s" podCreationTimestamp="2026-04-21 06:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:27:57.276378078 +0000 UTC m=+81.928823903" watchObservedRunningTime="2026-04-21 06:27:57.277477518 +0000 UTC m=+81.929923337" Apr 21 06:27:59.270283 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:59.270246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" event={"ID":"50263a7c-1596-4353-a40d-4453e307fb4f","Type":"ContainerStarted","Data":"295ace0e591ec92b5f38aeedaff5bd8dc711178347520bcf2ee57627ec03dd67"} Apr 21 06:27:59.270283 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:59.270282 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" event={"ID":"50263a7c-1596-4353-a40d-4453e307fb4f","Type":"ContainerStarted","Data":"c91daf111c041f49a2d55c62d0e95732dc115de2346630ef9dc4a253646c41c7"} Apr 21 06:27:59.285143 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:27:59.285097 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2g6hc" podStartSLOduration=17.781266811 podStartE2EDuration="19.28508278s" podCreationTimestamp="2026-04-21 06:27:40 +0000 UTC" firstStartedPulling="2026-04-21 06:27:57.099928746 +0000 UTC m=+81.752374543" lastFinishedPulling="2026-04-21 06:27:58.603744714 +0000 UTC 
m=+83.256190512" observedRunningTime="2026-04-21 06:27:59.28427023 +0000 UTC m=+83.936716050" watchObservedRunningTime="2026-04-21 06:27:59.28508278 +0000 UTC m=+83.937528598" Apr 21 06:28:06.926154 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:06.926125 2577 scope.go:117] "RemoveContainer" containerID="00adaeb9b30731207b4ad824d81652851f2f4938c1f152b5f16e92ebb1c92d21" Apr 21 06:28:07.288702 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:07.288678 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:28:07.288914 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:07.288763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" event={"ID":"6803da32-a76e-4d0e-916c-a12f322ff600","Type":"ContainerStarted","Data":"d128c99d2284fe9adb811fbf76d789673a69fd3c5bd4d3efbfeaf1fb5f23a294"} Apr 21 06:28:07.289058 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:07.289041 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:28:07.293833 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:07.293815 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" Apr 21 06:28:07.303493 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:07.303451 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fdnr4" podStartSLOduration=22.997305525 podStartE2EDuration="27.303438925s" podCreationTimestamp="2026-04-21 06:27:40 +0000 UTC" firstStartedPulling="2026-04-21 06:27:41.511616939 +0000 UTC m=+66.164062736" lastFinishedPulling="2026-04-21 06:27:45.817750338 +0000 UTC m=+70.470196136" observedRunningTime="2026-04-21 06:28:07.30242428 +0000 UTC 
m=+91.954870120" watchObservedRunningTime="2026-04-21 06:28:07.303438925 +0000 UTC m=+91.955884744" Apr 21 06:28:09.228773 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.228716 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2mh7j"] Apr 21 06:28:09.232222 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.232205 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.234421 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.234388 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 06:28:09.235239 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.235219 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7w99g\"" Apr 21 06:28:09.235345 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.235242 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 06:28:09.244568 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.244545 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2mh7j"] Apr 21 06:28:09.269363 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.269330 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7579d96757-p2wbq"] Apr 21 06:28:09.294994 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.294967 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2"] Apr 21 06:28:09.298104 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.298085 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.300250 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.300225 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 06:28:09.300381 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.300232 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 06:28:09.300381 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.300334 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-kl9tj\"" Apr 21 06:28:09.305317 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.305290 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2"] Apr 21 06:28:09.335701 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.335666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c54eb281-5e83-4630-8888-50ac3bdb9ed7-crio-socket\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.335701 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.335698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c54eb281-5e83-4630-8888-50ac3bdb9ed7-data-volume\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.335913 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.335718 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d28x\" (UniqueName: \"kubernetes.io/projected/c54eb281-5e83-4630-8888-50ac3bdb9ed7-kube-api-access-8d28x\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.335913 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.335801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c54eb281-5e83-4630-8888-50ac3bdb9ed7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.335913 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.335854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c54eb281-5e83-4630-8888-50ac3bdb9ed7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.436528 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40dae373-b21d-4d63-9f06-93c00008d853-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4rhg2\" (UID: \"40dae373-b21d-4d63-9f06-93c00008d853\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.436714 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c54eb281-5e83-4630-8888-50ac3bdb9ed7-crio-socket\") 
pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.436714 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c54eb281-5e83-4630-8888-50ac3bdb9ed7-data-volume\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.436714 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d28x\" (UniqueName: \"kubernetes.io/projected/c54eb281-5e83-4630-8888-50ac3bdb9ed7-kube-api-access-8d28x\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.436910 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c54eb281-5e83-4630-8888-50ac3bdb9ed7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.436910 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c54eb281-5e83-4630-8888-50ac3bdb9ed7-crio-socket\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.436910 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436872 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/40dae373-b21d-4d63-9f06-93c00008d853-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4rhg2\" (UID: \"40dae373-b21d-4d63-9f06-93c00008d853\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.437064 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.436912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c54eb281-5e83-4630-8888-50ac3bdb9ed7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.437106 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.437093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c54eb281-5e83-4630-8888-50ac3bdb9ed7-data-volume\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.437422 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.437402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c54eb281-5e83-4630-8888-50ac3bdb9ed7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.439266 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.439244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c54eb281-5e83-4630-8888-50ac3bdb9ed7-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.444742 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.444704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d28x\" (UniqueName: \"kubernetes.io/projected/c54eb281-5e83-4630-8888-50ac3bdb9ed7-kube-api-access-8d28x\") pod \"insights-runtime-extractor-2mh7j\" (UID: \"c54eb281-5e83-4630-8888-50ac3bdb9ed7\") " pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.538321 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.538239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40dae373-b21d-4d63-9f06-93c00008d853-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4rhg2\" (UID: \"40dae373-b21d-4d63-9f06-93c00008d853\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.538321 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.538310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/40dae373-b21d-4d63-9f06-93c00008d853-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4rhg2\" (UID: \"40dae373-b21d-4d63-9f06-93c00008d853\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.538946 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.538919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40dae373-b21d-4d63-9f06-93c00008d853-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4rhg2\" (UID: \"40dae373-b21d-4d63-9f06-93c00008d853\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.540955 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:28:09.540935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/40dae373-b21d-4d63-9f06-93c00008d853-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4rhg2\" (UID: \"40dae373-b21d-4d63-9f06-93c00008d853\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.541065 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.541053 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2mh7j" Apr 21 06:28:09.607470 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.607444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" Apr 21 06:28:09.668403 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.668354 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2mh7j"] Apr 21 06:28:09.672464 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:09.672435 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54eb281_5e83_4630_8888_50ac3bdb9ed7.slice/crio-5c8c2899f3dc4de537665839e1b81df1c522ecf22c0a7abc1771c3aca54b6632 WatchSource:0}: Error finding container 5c8c2899f3dc4de537665839e1b81df1c522ecf22c0a7abc1771c3aca54b6632: Status 404 returned error can't find the container with id 5c8c2899f3dc4de537665839e1b81df1c522ecf22c0a7abc1771c3aca54b6632 Apr 21 06:28:09.735256 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:09.735230 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2"] Apr 21 06:28:09.738196 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:09.738164 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40dae373_b21d_4d63_9f06_93c00008d853.slice/crio-8cf36fdddb4249afef1f2c0245492f9ae5c0e08982ccce8406bcdb620803229a WatchSource:0}: Error finding container 8cf36fdddb4249afef1f2c0245492f9ae5c0e08982ccce8406bcdb620803229a: Status 404 returned error can't find the container with id 8cf36fdddb4249afef1f2c0245492f9ae5c0e08982ccce8406bcdb620803229a Apr 21 06:28:10.299323 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:10.299273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" event={"ID":"40dae373-b21d-4d63-9f06-93c00008d853","Type":"ContainerStarted","Data":"8cf36fdddb4249afef1f2c0245492f9ae5c0e08982ccce8406bcdb620803229a"} Apr 21 06:28:10.301194 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:10.301167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mh7j" event={"ID":"c54eb281-5e83-4630-8888-50ac3bdb9ed7","Type":"ContainerStarted","Data":"65245ac82a6a5935a5c5e3b472b2a8d7052f065b2323e9c344242d804f1d22fa"} Apr 21 06:28:10.301194 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:10.301202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mh7j" event={"ID":"c54eb281-5e83-4630-8888-50ac3bdb9ed7","Type":"ContainerStarted","Data":"5c8c2899f3dc4de537665839e1b81df1c522ecf22c0a7abc1771c3aca54b6632"} Apr 21 06:28:11.307568 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:11.307529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mh7j" event={"ID":"c54eb281-5e83-4630-8888-50ac3bdb9ed7","Type":"ContainerStarted","Data":"6ed7666146d9e6d50ec0c71d2d0d19839bd435f18ddfe7ef34c456854b11102c"} Apr 21 06:28:11.308847 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:11.308822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" event={"ID":"40dae373-b21d-4d63-9f06-93c00008d853","Type":"ContainerStarted","Data":"d068f04acaaec3adaaeca807b02bf55cdaac45c6d0cc7fbcff19ac784c3fe432"} Apr 21 06:28:11.322968 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:11.322929 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4rhg2" podStartSLOduration=1.427459705 podStartE2EDuration="2.322916672s" podCreationTimestamp="2026-04-21 06:28:09 +0000 UTC" firstStartedPulling="2026-04-21 06:28:09.740174101 +0000 UTC m=+94.392619907" lastFinishedPulling="2026-04-21 06:28:10.635631076 +0000 UTC m=+95.288076874" observedRunningTime="2026-04-21 06:28:11.321799209 +0000 UTC m=+95.974245029" watchObservedRunningTime="2026-04-21 06:28:11.322916672 +0000 UTC m=+95.975362482" Apr 21 06:28:12.313357 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.313323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2mh7j" event={"ID":"c54eb281-5e83-4630-8888-50ac3bdb9ed7","Type":"ContainerStarted","Data":"31b0fce94786f2cf88d4336b0208dc1654b19569d0ba6e09f8d67b8ef718caab"} Apr 21 06:28:12.329030 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.328976 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2mh7j" podStartSLOduration=1.501391018 podStartE2EDuration="3.328956141s" podCreationTimestamp="2026-04-21 06:28:09 +0000 UTC" firstStartedPulling="2026-04-21 06:28:09.751086064 +0000 UTC m=+94.403531862" lastFinishedPulling="2026-04-21 06:28:11.578651189 +0000 UTC m=+96.231096985" observedRunningTime="2026-04-21 06:28:12.327902481 +0000 UTC m=+96.980348334" watchObservedRunningTime="2026-04-21 06:28:12.328956141 +0000 UTC m=+96.981401963" Apr 21 06:28:12.865213 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.865178 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:28:12.867845 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.867817 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/634bd9b2-8299-43f0-9124-eb65af43af1e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-gmsbd\" (UID: \"634bd9b2-8299-43f0-9124-eb65af43af1e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:28:12.967678 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.967636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:12.967862 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.967843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:12.968424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.968402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d743fd44-3762-47ee-9a4c-617f122ba333-service-ca-bundle\") pod \"router-default-5968f9cfc4-wxd8m\" 
(UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:12.970210 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:12.970191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d743fd44-3762-47ee-9a4c-617f122ba333-metrics-certs\") pod \"router-default-5968f9cfc4-wxd8m\" (UID: \"d743fd44-3762-47ee-9a4c-617f122ba333\") " pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:13.114506 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.114453 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" Apr 21 06:28:13.169809 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.169773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:28:13.172526 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.172483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85b9a72-4484-46f7-bab5-6a307b7bd43f-metrics-tls\") pod \"dns-default-rkcmp\" (UID: \"f85b9a72-4484-46f7-bab5-6a307b7bd43f\") " pod="openshift-dns/dns-default-rkcmp" Apr 21 06:28:13.182076 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.182056 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz466\"" Apr 21 06:28:13.190966 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.190925 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rkcmp" Apr 21 06:28:13.192792 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.192717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:13.250067 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.249986 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd"] Apr 21 06:28:13.258435 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:13.257052 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod634bd9b2_8299_43f0_9124_eb65af43af1e.slice/crio-2d3e5dd41c53ea47ed3980c68c6c6b261dbae814816a4fd3ac127271810e4a70 WatchSource:0}: Error finding container 2d3e5dd41c53ea47ed3980c68c6c6b261dbae814816a4fd3ac127271810e4a70: Status 404 returned error can't find the container with id 2d3e5dd41c53ea47ed3980c68c6c6b261dbae814816a4fd3ac127271810e4a70 Apr 21 06:28:13.271203 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.271169 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg" Apr 21 06:28:13.273911 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.273846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3336a9c5-62bd-44a2-8149-ccbdebfdb50a-cert\") pod \"ingress-canary-2dlsg\" (UID: \"3336a9c5-62bd-44a2-8149-ccbdebfdb50a\") " pod="openshift-ingress-canary/ingress-canary-2dlsg" Apr 21 06:28:13.320905 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.320865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" event={"ID":"634bd9b2-8299-43f0-9124-eb65af43af1e","Type":"ContainerStarted","Data":"2d3e5dd41c53ea47ed3980c68c6c6b261dbae814816a4fd3ac127271810e4a70"} Apr 21 06:28:13.323520 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.323493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rkcmp"] Apr 21 06:28:13.326624 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:13.326596 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85b9a72_4484_46f7_bab5_6a307b7bd43f.slice/crio-431f65dfb99b413739d74366778595d3e1988122879f07063761b28ef1e74877 WatchSource:0}: Error finding container 431f65dfb99b413739d74366778595d3e1988122879f07063761b28ef1e74877: Status 404 returned error can't find the container with id 431f65dfb99b413739d74366778595d3e1988122879f07063761b28ef1e74877 Apr 21 06:28:13.342131 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.342105 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5968f9cfc4-wxd8m"] Apr 21 06:28:13.345297 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:13.345268 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd743fd44_3762_47ee_9a4c_617f122ba333.slice/crio-4bee0a283fa7de55c45006d80c9ca791b90055658c8cdd0b2626d851ef1afdbd WatchSource:0}: Error finding container 4bee0a283fa7de55c45006d80c9ca791b90055658c8cdd0b2626d851ef1afdbd: Status 404 returned error can't find the container with id 4bee0a283fa7de55c45006d80c9ca791b90055658c8cdd0b2626d851ef1afdbd Apr 21 06:28:13.496444 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.496365 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-98jt7\"" Apr 21 06:28:13.504224 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:28:13.504200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2dlsg" Apr 21 06:28:13.623947 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:13.623915 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2dlsg"] Apr 21 06:28:13.627196 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:13.627166 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3336a9c5_62bd_44a2_8149_ccbdebfdb50a.slice/crio-2a939e4fb1a057f5602983b72f81bb5034615741ec3633228bf98de5288a20cb WatchSource:0}: Error finding container 2a939e4fb1a057f5602983b72f81bb5034615741ec3633228bf98de5288a20cb: Status 404 returned error can't find the container with id 2a939e4fb1a057f5602983b72f81bb5034615741ec3633228bf98de5288a20cb Apr 21 06:28:14.324684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:14.324639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkcmp" event={"ID":"f85b9a72-4484-46f7-bab5-6a307b7bd43f","Type":"ContainerStarted","Data":"431f65dfb99b413739d74366778595d3e1988122879f07063761b28ef1e74877"} Apr 21 06:28:14.325835 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:14.325806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2dlsg" event={"ID":"3336a9c5-62bd-44a2-8149-ccbdebfdb50a","Type":"ContainerStarted","Data":"2a939e4fb1a057f5602983b72f81bb5034615741ec3633228bf98de5288a20cb"} Apr 21 06:28:14.327288 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:14.327261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" event={"ID":"d743fd44-3762-47ee-9a4c-617f122ba333","Type":"ContainerStarted","Data":"f2822e12ef391d95744344291954984a25b19119d2d27045e01bbe4e84ad6ae0"} Apr 21 06:28:14.327408 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:14.327296 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" event={"ID":"d743fd44-3762-47ee-9a4c-617f122ba333","Type":"ContainerStarted","Data":"4bee0a283fa7de55c45006d80c9ca791b90055658c8cdd0b2626d851ef1afdbd"} Apr 21 06:28:14.346978 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:14.346929 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" podStartSLOduration=34.346913118 podStartE2EDuration="34.346913118s" podCreationTimestamp="2026-04-21 06:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:28:14.345384496 +0000 UTC m=+98.997830328" watchObservedRunningTime="2026-04-21 06:28:14.346913118 +0000 UTC m=+98.999358938" Apr 21 06:28:15.193406 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:15.193362 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:15.195824 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:15.195802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:15.330885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:15.330853 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:15.332281 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:15.332260 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5968f9cfc4-wxd8m" Apr 21 06:28:16.335591 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.335487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2dlsg" 
event={"ID":"3336a9c5-62bd-44a2-8149-ccbdebfdb50a","Type":"ContainerStarted","Data":"545f243e52cb39a984459179418be931760ef58f632cabb73c8ac8133a595e96"} Apr 21 06:28:16.337577 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.337544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkcmp" event={"ID":"f85b9a72-4484-46f7-bab5-6a307b7bd43f","Type":"ContainerStarted","Data":"717661c2b8fdee0193db3e877e4184294b24a90d1fc179db46309438bab5ee6b"} Apr 21 06:28:16.337577 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.337578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkcmp" event={"ID":"f85b9a72-4484-46f7-bab5-6a307b7bd43f","Type":"ContainerStarted","Data":"912661b9d4591dab8b29548b2213b199bc502398bbf482e9afad9b2b5d9bcbb2"} Apr 21 06:28:16.337807 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.337662 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rkcmp" Apr 21 06:28:16.339080 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.339059 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" event={"ID":"634bd9b2-8299-43f0-9124-eb65af43af1e","Type":"ContainerStarted","Data":"875c1fb679abea9ba221c14e070d83eab6736b4195da8c653abf4d3dcb8ab021"} Apr 21 06:28:16.349189 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.349147 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2dlsg" podStartSLOduration=65.234489088 podStartE2EDuration="1m7.349137721s" podCreationTimestamp="2026-04-21 06:27:09 +0000 UTC" firstStartedPulling="2026-04-21 06:28:13.628967366 +0000 UTC m=+98.281413165" lastFinishedPulling="2026-04-21 06:28:15.743615994 +0000 UTC m=+100.396061798" observedRunningTime="2026-04-21 06:28:16.349068629 +0000 UTC m=+101.001514449" watchObservedRunningTime="2026-04-21 06:28:16.349137721 +0000 UTC 
m=+101.001583540" Apr 21 06:28:16.364842 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.364784 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rkcmp" podStartSLOduration=64.956250851 podStartE2EDuration="1m7.364767037s" podCreationTimestamp="2026-04-21 06:27:09 +0000 UTC" firstStartedPulling="2026-04-21 06:28:13.32851184 +0000 UTC m=+97.980957638" lastFinishedPulling="2026-04-21 06:28:15.73702802 +0000 UTC m=+100.389473824" observedRunningTime="2026-04-21 06:28:16.363789122 +0000 UTC m=+101.016234940" watchObservedRunningTime="2026-04-21 06:28:16.364767037 +0000 UTC m=+101.017212855" Apr 21 06:28:16.379396 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:16.379344 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-gmsbd" podStartSLOduration=33.902465921 podStartE2EDuration="36.379332108s" podCreationTimestamp="2026-04-21 06:27:40 +0000 UTC" firstStartedPulling="2026-04-21 06:28:13.260557462 +0000 UTC m=+97.913003269" lastFinishedPulling="2026-04-21 06:28:15.737423658 +0000 UTC m=+100.389869456" observedRunningTime="2026-04-21 06:28:16.378861501 +0000 UTC m=+101.031307321" watchObservedRunningTime="2026-04-21 06:28:16.379332108 +0000 UTC m=+101.031777928" Apr 21 06:28:17.231828 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:17.231798 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-thvnj" Apr 21 06:28:19.274798 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.274770 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:28:19.316103 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.316074 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-trw9w"] Apr 21 06:28:19.348636 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:28:19.348600 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-trw9w"] Apr 21 06:28:19.348829 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.348756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.351168 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.351145 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 06:28:19.351296 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.351233 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 06:28:19.351296 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.351243 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 06:28:19.351491 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.351475 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-86rd4\"" Apr 21 06:28:19.421618 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.421592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.421814 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.421631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.421814 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.421654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa012fc-52ee-4eb5-91dc-c3f733196956-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.421814 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.421710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrm9\" (UniqueName: \"kubernetes.io/projected/3fa012fc-52ee-4eb5-91dc-c3f733196956-kube-api-access-5hrm9\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.522796 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.522752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.523004 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.522808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: 
\"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.523004 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.522831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa012fc-52ee-4eb5-91dc-c3f733196956-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.523004 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.522851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrm9\" (UniqueName: \"kubernetes.io/projected/3fa012fc-52ee-4eb5-91dc-c3f733196956-kube-api-access-5hrm9\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.523004 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:28:19.522969 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 21 06:28:19.523218 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:28:19.523061 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-tls podName:3fa012fc-52ee-4eb5-91dc-c3f733196956 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:20.023040914 +0000 UTC m=+104.675486725 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-trw9w" (UID: "3fa012fc-52ee-4eb5-91dc-c3f733196956") : secret "prometheus-operator-tls" not found Apr 21 06:28:19.523633 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.523611 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa012fc-52ee-4eb5-91dc-c3f733196956-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.525348 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.525290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:19.531440 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:19.531417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrm9\" (UniqueName: \"kubernetes.io/projected/3fa012fc-52ee-4eb5-91dc-c3f733196956-kube-api-access-5hrm9\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:20.027438 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.027407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: 
\"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:20.029997 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.029964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fa012fc-52ee-4eb5-91dc-c3f733196956-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-trw9w\" (UID: \"3fa012fc-52ee-4eb5-91dc-c3f733196956\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:20.258470 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.258435 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" Apr 21 06:28:20.377169 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.377034 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-trw9w"] Apr 21 06:28:20.379592 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:20.379558 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa012fc_52ee_4eb5_91dc_c3f733196956.slice/crio-95b87197f0103c923cf2f125c0e9f6302d356f8436388c1ec991fbbf83acef95 WatchSource:0}: Error finding container 95b87197f0103c923cf2f125c0e9f6302d356f8436388c1ec991fbbf83acef95: Status 404 returned error can't find the container with id 95b87197f0103c923cf2f125c0e9f6302d356f8436388c1ec991fbbf83acef95 Apr 21 06:28:20.515076 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.515036 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68f6966bf9-flnb6"] Apr 21 06:28:20.538050 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.538023 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f6966bf9-flnb6"] Apr 21 06:28:20.538182 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.538133 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:20.540495 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.540468 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 06:28:20.540495 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.540482 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 06:28:20.540704 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.540540 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 06:28:20.540704 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.540542 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 06:28:20.540704 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.540585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 06:28:20.540704 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.540468 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 06:28:20.541252 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.541236 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-rk6ld\"" Apr 21 06:28:20.541329 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.541280 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 06:28:20.631858 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.631773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-service-ca\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:20.631858 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.631842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-config\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:20.632060 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.631866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-oauth-config\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:20.632060 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.631892 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-serving-cert\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:20.632060 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.631920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-oauth-serving-cert\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:20.632060 ip-10-0-129-55 kubenswrapper[2577]: 
I0421 06:28:20.631942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzck\" (UniqueName: \"kubernetes.io/projected/bda38237-5a37-4a09-ae06-dfa8f2dc1240-kube-api-access-grzck\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.732323 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.732286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-config\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.732476 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.732332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-oauth-config\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.732476 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.732368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-serving-cert\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.732476 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.732399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-oauth-serving-cert\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.732476 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.732417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grzck\" (UniqueName: \"kubernetes.io/projected/bda38237-5a37-4a09-ae06-dfa8f2dc1240-kube-api-access-grzck\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.732476 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.732442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-service-ca\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.733104 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.733043 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-config\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.733104 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.733089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-oauth-serving-cert\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.733104 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.733104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-service-ca\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.734973 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.734946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-oauth-config\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.735077 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.734998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-serving-cert\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.739976 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.739949 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzck\" (UniqueName: \"kubernetes.io/projected/bda38237-5a37-4a09-ae06-dfa8f2dc1240-kube-api-access-grzck\") pod \"console-68f6966bf9-flnb6\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.847438 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.847398 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f6966bf9-flnb6"
Apr 21 06:28:20.986883 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:20.986847 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68f6966bf9-flnb6"]
Apr 21 06:28:20.994912 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:20.994881 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda38237_5a37_4a09_ae06_dfa8f2dc1240.slice/crio-aaab1c103a755fab8eeae88ddb2e6f0c4c7e13e65bef422132afc31b54ae979c WatchSource:0}: Error finding container aaab1c103a755fab8eeae88ddb2e6f0c4c7e13e65bef422132afc31b54ae979c: Status 404 returned error can't find the container with id aaab1c103a755fab8eeae88ddb2e6f0c4c7e13e65bef422132afc31b54ae979c
Apr 21 06:28:21.356644 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:21.356599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f6966bf9-flnb6" event={"ID":"bda38237-5a37-4a09-ae06-dfa8f2dc1240","Type":"ContainerStarted","Data":"aaab1c103a755fab8eeae88ddb2e6f0c4c7e13e65bef422132afc31b54ae979c"}
Apr 21 06:28:21.357743 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:21.357705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" event={"ID":"3fa012fc-52ee-4eb5-91dc-c3f733196956","Type":"ContainerStarted","Data":"95b87197f0103c923cf2f125c0e9f6302d356f8436388c1ec991fbbf83acef95"}
Apr 21 06:28:22.361951 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:22.361908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" event={"ID":"3fa012fc-52ee-4eb5-91dc-c3f733196956","Type":"ContainerStarted","Data":"cd1831820d7bdbb1a90afcdcbb989cc6ffaa8f20b675d62c3a461dcf55c4ee2a"}
Apr 21 06:28:22.361951 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:22.361953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" event={"ID":"3fa012fc-52ee-4eb5-91dc-c3f733196956","Type":"ContainerStarted","Data":"6518dc761a3fb66c9fe4baed58a4168e1e366e8e4ba52a849569f7e7243f83cf"}
Apr 21 06:28:22.379933 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:22.379871 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-trw9w" podStartSLOduration=1.847728034 podStartE2EDuration="3.37985304s" podCreationTimestamp="2026-04-21 06:28:19 +0000 UTC" firstStartedPulling="2026-04-21 06:28:20.381393918 +0000 UTC m=+105.033839717" lastFinishedPulling="2026-04-21 06:28:21.91351892 +0000 UTC m=+106.565964723" observedRunningTime="2026-04-21 06:28:22.377251576 +0000 UTC m=+107.029697396" watchObservedRunningTime="2026-04-21 06:28:22.37985304 +0000 UTC m=+107.032298860"
Apr 21 06:28:24.371031 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.370944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f6966bf9-flnb6" event={"ID":"bda38237-5a37-4a09-ae06-dfa8f2dc1240","Type":"ContainerStarted","Data":"20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604"}
Apr 21 06:28:24.394784 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.394716 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68f6966bf9-flnb6" podStartSLOduration=1.317048075 podStartE2EDuration="4.394697348s" podCreationTimestamp="2026-04-21 06:28:20 +0000 UTC" firstStartedPulling="2026-04-21 06:28:20.996839463 +0000 UTC m=+105.649285265" lastFinishedPulling="2026-04-21 06:28:24.074488738 +0000 UTC m=+108.726934538" observedRunningTime="2026-04-21 06:28:24.393563202 +0000 UTC m=+109.046009022" watchObservedRunningTime="2026-04-21 06:28:24.394697348 +0000 UTC m=+109.047143166"
Apr 21 06:28:24.658906 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.658825 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8m8jk"]
Apr 21 06:28:24.662678 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.662656 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.664715 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.664689 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 06:28:24.664857 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.664747 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 06:28:24.665041 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.665011 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7m4pt\""
Apr 21 06:28:24.665111 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.665082 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 06:28:24.769475 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-textfile\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769475 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-wtmp\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769718 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09e05b13-b498-4dc1-8766-74242c9ea87e-metrics-client-ca\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769718 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-tls\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769718 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-root\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769718 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769658 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-sys\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769718 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2gkb\" (UniqueName: \"kubernetes.io/projected/09e05b13-b498-4dc1-8766-74242c9ea87e-kube-api-access-d2gkb\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769988 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.769988 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.769764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-accelerators-collector-config\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.870966 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.870927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-root\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871153 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.870981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-sys\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871153 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2gkb\" (UniqueName: \"kubernetes.io/projected/09e05b13-b498-4dc1-8766-74242c9ea87e-kube-api-access-d2gkb\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871153 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-root\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871153 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-sys\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871153 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-accelerators-collector-config\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-textfile\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-wtmp\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09e05b13-b498-4dc1-8766-74242c9ea87e-metrics-client-ca\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-tls\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871656 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-wtmp\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871656 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871544 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-textfile\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.871968 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.871945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09e05b13-b498-4dc1-8766-74242c9ea87e-metrics-client-ca\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.872040 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.872016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-accelerators-collector-config\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.873540 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.873518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.873768 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.873752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/09e05b13-b498-4dc1-8766-74242c9ea87e-node-exporter-tls\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.881881 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.881847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2gkb\" (UniqueName: \"kubernetes.io/projected/09e05b13-b498-4dc1-8766-74242c9ea87e-kube-api-access-d2gkb\") pod \"node-exporter-8m8jk\" (UID: \"09e05b13-b498-4dc1-8766-74242c9ea87e\") " pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:24.974611 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:24.974576 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8m8jk"
Apr 21 06:28:25.374845 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:25.374761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8m8jk" event={"ID":"09e05b13-b498-4dc1-8766-74242c9ea87e","Type":"ContainerStarted","Data":"4b8eecccacab198cf9e9be8d6d2a3991f635a8a9eea93ee6b4a971e755c7a7a8"}
Apr 21 06:28:26.348757 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:26.348712 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rkcmp"
Apr 21 06:28:26.379242 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:26.379209 2577 generic.go:358] "Generic (PLEG): container finished" podID="09e05b13-b498-4dc1-8766-74242c9ea87e" containerID="e38bc37e71e4c3c065ae5308c0791403c66f2978142bd9e4b5027816b76e8f4d" exitCode=0
Apr 21 06:28:26.379675 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:26.379287 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8m8jk" event={"ID":"09e05b13-b498-4dc1-8766-74242c9ea87e","Type":"ContainerDied","Data":"e38bc37e71e4c3c065ae5308c0791403c66f2978142bd9e4b5027816b76e8f4d"}
Apr 21 06:28:27.308185 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.308150 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86974fd9db-9pnxc"]
Apr 21 06:28:27.311524 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.311501 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.318895 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.318860 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 06:28:27.320225 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.320200 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86974fd9db-9pnxc"]
Apr 21 06:28:27.383765 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.383708 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8m8jk" event={"ID":"09e05b13-b498-4dc1-8766-74242c9ea87e","Type":"ContainerStarted","Data":"37862e5ee0f166616ce392c5dfb7623c4afa461182c725a92238bcfd1bd11ae5"}
Apr 21 06:28:27.384115 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.383772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8m8jk" event={"ID":"09e05b13-b498-4dc1-8766-74242c9ea87e","Type":"ContainerStarted","Data":"2ef4c26bb95f04ab68aecba9e1bb6ecb9f799c72a4f9282b0c6e0078ebc2dab2"}
Apr 21 06:28:27.395882 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.395856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-oauth-serving-cert\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.396028 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.395929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-console-config\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.396028 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.395969 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxwvh\" (UniqueName: \"kubernetes.io/projected/c7c85c7b-7eef-408f-8525-051838d8f407-kube-api-access-rxwvh\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.396028 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.395999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-service-ca\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.396028 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.396021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-trusted-ca-bundle\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.396177 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.396039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-oauth-config\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.396177 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.396064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-serving-cert\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.402828 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.402781 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8m8jk" podStartSLOduration=2.467773653 podStartE2EDuration="3.402765446s" podCreationTimestamp="2026-04-21 06:28:24 +0000 UTC" firstStartedPulling="2026-04-21 06:28:24.98931748 +0000 UTC m=+109.641763282" lastFinishedPulling="2026-04-21 06:28:25.924309264 +0000 UTC m=+110.576755075" observedRunningTime="2026-04-21 06:28:27.40168185 +0000 UTC m=+112.054127669" watchObservedRunningTime="2026-04-21 06:28:27.402765446 +0000 UTC m=+112.055211267"
Apr 21 06:28:27.496983 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.496913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-trusted-ca-bundle\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.497150 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.496991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-oauth-config\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.497150 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.497022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-serving-cert\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.497625 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.497595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-oauth-serving-cert\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.497797 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.497777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-console-config\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.497881 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.497861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxwvh\" (UniqueName: \"kubernetes.io/projected/c7c85c7b-7eef-408f-8525-051838d8f407-kube-api-access-rxwvh\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.497938 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.497919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-service-ca\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.498414 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.498385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-trusted-ca-bundle\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.499231 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.499204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-service-ca\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.499231 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.499202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-oauth-serving-cert\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.500027 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.500007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-console-config\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.500402 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.500376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-oauth-config\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.503888 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.503865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-serving-cert\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.510480 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.510460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxwvh\" (UniqueName: \"kubernetes.io/projected/c7c85c7b-7eef-408f-8525-051838d8f407-kube-api-access-rxwvh\") pod \"console-86974fd9db-9pnxc\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.621915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.621817 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:28:27.760201 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:27.760166 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86974fd9db-9pnxc"]
Apr 21 06:28:27.763279 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:27.763244 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c85c7b_7eef_408f_8525_051838d8f407.slice/crio-8fa48c3b6e967f2fa3c4454d7312135ee70c59991c956afd8a883918a9e5276b WatchSource:0}: Error finding container 8fa48c3b6e967f2fa3c4454d7312135ee70c59991c956afd8a883918a9e5276b: Status 404 returned error can't find the container with id 8fa48c3b6e967f2fa3c4454d7312135ee70c59991c956afd8a883918a9e5276b
Apr 21 06:28:28.389067 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:28.389020 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86974fd9db-9pnxc" event={"ID":"c7c85c7b-7eef-408f-8525-051838d8f407","Type":"ContainerStarted","Data":"775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1"}
Apr 21 06:28:28.389067 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:28.389067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86974fd9db-9pnxc" event={"ID":"c7c85c7b-7eef-408f-8525-051838d8f407","Type":"ContainerStarted","Data":"8fa48c3b6e967f2fa3c4454d7312135ee70c59991c956afd8a883918a9e5276b"}
Apr 21 06:28:28.406033 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:28.405976 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86974fd9db-9pnxc" podStartSLOduration=1.405961066 podStartE2EDuration="1.405961066s" podCreationTimestamp="2026-04-21 06:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:28:28.404810607 +0000 UTC m=+113.057256426" watchObservedRunningTime="2026-04-21 06:28:28.405961066 +0000 UTC m=+113.058406884"
Apr 21 06:28:29.854633 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.854602 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc"]
Apr 21 06:28:29.860264 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.860242 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc"
Apr 21 06:28:29.862905 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.862801 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 21 06:28:29.862905 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.862810 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 21 06:28:29.862905 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.862873 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 21 06:28:29.863132 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.862905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-tvmd6\""
Apr 21 06:28:29.863132 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.863111 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 21 06:28:29.863224 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.863191 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 21 06:28:29.868348 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.868328 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 21 06:28:29.869095 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.869073 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc"]
Apr 21 06:28:29.922797 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.922761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-telemeter-client-tls\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:29.922965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.922823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-federate-client-tls\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:29.922965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.922877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:29.922965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.922949 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z6rs\" (UniqueName: \"kubernetes.io/projected/c41a9151-74f0-4a24-a44f-64934f976f61-kube-api-access-2z6rs\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:29.923065 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.922984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-serving-certs-ca-bundle\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:29.923065 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.923018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-telemeter-trusted-ca-bundle\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:29.923065 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.923052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-secret-telemeter-client\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:29.923169 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:29.923088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-metrics-client-ca\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024433 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.024399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-federate-client-tls\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: 
\"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024433 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.024438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024664 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.024466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z6rs\" (UniqueName: \"kubernetes.io/projected/c41a9151-74f0-4a24-a44f-64934f976f61-kube-api-access-2z6rs\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024664 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.024532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-serving-certs-ca-bundle\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024664 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.024593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-telemeter-trusted-ca-bundle\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024664 ip-10-0-129-55 kubenswrapper[2577]: 
I0421 06:28:30.024628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-secret-telemeter-client\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024664 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.024653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-metrics-client-ca\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.024969 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.024830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-telemeter-client-tls\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.025552 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.025527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-serving-certs-ca-bundle\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.025657 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.025576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-metrics-client-ca\") pod 
\"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.026133 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.026115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41a9151-74f0-4a24-a44f-64934f976f61-telemeter-trusted-ca-bundle\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.027258 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.027229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.027258 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.027246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-federate-client-tls\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.027767 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.027745 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-telemeter-client-tls\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.027767 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:28:30.027759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c41a9151-74f0-4a24-a44f-64934f976f61-secret-telemeter-client\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.051466 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.051440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z6rs\" (UniqueName: \"kubernetes.io/projected/c41a9151-74f0-4a24-a44f-64934f976f61-kube-api-access-2z6rs\") pod \"telemeter-client-785b4dc9d7-nfpwc\" (UID: \"c41a9151-74f0-4a24-a44f-64934f976f61\") " pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.132612 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.132523 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f6966bf9-flnb6"] Apr 21 06:28:30.169549 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.169515 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68d755874-bt42h"] Apr 21 06:28:30.172149 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.172131 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" Apr 21 06:28:30.174112 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.174061 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.183117 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.183090 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68d755874-bt42h"] Apr 21 06:28:30.227454 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.227358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-trusted-ca-bundle\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.227607 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.227461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-oauth-config\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.227607 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.227524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-serving-cert\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.227607 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.227556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jl8w\" (UniqueName: \"kubernetes.io/projected/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-kube-api-access-7jl8w\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 
06:28:30.227607 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.227587 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-config\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.227840 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.227611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-oauth-serving-cert\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.227840 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.227694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-service-ca\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.302150 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.302034 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc"] Apr 21 06:28:30.304330 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:30.304308 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41a9151_74f0_4a24_a44f_64934f976f61.slice/crio-735b65889a2332af3dd8d49c830943609254387ec775dfb3a60eedbb2da24efa WatchSource:0}: Error finding container 735b65889a2332af3dd8d49c830943609254387ec775dfb3a60eedbb2da24efa: Status 404 returned error can't find the container with id 
735b65889a2332af3dd8d49c830943609254387ec775dfb3a60eedbb2da24efa Apr 21 06:28:30.328786 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.328758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-serving-cert\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.328923 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.328802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jl8w\" (UniqueName: \"kubernetes.io/projected/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-kube-api-access-7jl8w\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.328983 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.328931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-config\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.328983 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.328972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-oauth-serving-cert\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.329092 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.329003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-service-ca\") pod 
\"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.329092 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.329085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-trusted-ca-bundle\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.329191 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.329150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-oauth-config\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.329672 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.329650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-oauth-serving-cert\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.329823 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.329700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-config\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.330514 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.330488 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-service-ca\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.330578 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.330558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-trusted-ca-bundle\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.331905 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.331883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-oauth-config\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.331985 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.331971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-serving-cert\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.338873 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.338852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jl8w\" (UniqueName: \"kubernetes.io/projected/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-kube-api-access-7jl8w\") pod \"console-68d755874-bt42h\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") " pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.400680 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.400592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" event={"ID":"c41a9151-74f0-4a24-a44f-64934f976f61","Type":"ContainerStarted","Data":"735b65889a2332af3dd8d49c830943609254387ec775dfb3a60eedbb2da24efa"} Apr 21 06:28:30.501358 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.501315 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:30.639746 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.639698 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68d755874-bt42h"] Apr 21 06:28:30.642943 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:30.642912 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe505ae2_2ddf_4cf9_a5ae_fa6b5e9f279b.slice/crio-328d95c02019418451b45889a64025afddec3cbda7d07ed9c75f712c3721ec18 WatchSource:0}: Error finding container 328d95c02019418451b45889a64025afddec3cbda7d07ed9c75f712c3721ec18: Status 404 returned error can't find the container with id 328d95c02019418451b45889a64025afddec3cbda7d07ed9c75f712c3721ec18 Apr 21 06:28:30.848019 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:30.847958 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:31.404875 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:31.404841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d755874-bt42h" event={"ID":"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b","Type":"ContainerStarted","Data":"12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721"} Apr 21 06:28:31.404875 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:31.404874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d755874-bt42h" 
event={"ID":"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b","Type":"ContainerStarted","Data":"328d95c02019418451b45889a64025afddec3cbda7d07ed9c75f712c3721ec18"} Apr 21 06:28:31.420464 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:31.420418 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68d755874-bt42h" podStartSLOduration=1.420406414 podStartE2EDuration="1.420406414s" podCreationTimestamp="2026-04-21 06:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:28:31.420170974 +0000 UTC m=+116.072616792" watchObservedRunningTime="2026-04-21 06:28:31.420406414 +0000 UTC m=+116.072852233" Apr 21 06:28:32.409605 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:32.409502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" event={"ID":"c41a9151-74f0-4a24-a44f-64934f976f61","Type":"ContainerStarted","Data":"ef9ff3012cf828fca49ba0fa12afb396a52b5fc55efe45a35a8aa5526f5957ec"} Apr 21 06:28:33.418234 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:33.418200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" event={"ID":"c41a9151-74f0-4a24-a44f-64934f976f61","Type":"ContainerStarted","Data":"2851fc5cb2eccf72a69d0ad413002faf08bce69021af41284ae3720d5f489475"} Apr 21 06:28:33.418588 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:33.418241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" event={"ID":"c41a9151-74f0-4a24-a44f-64934f976f61","Type":"ContainerStarted","Data":"8e216390a6d9c17ed7166b15baad6a9287e78a57629a853e991965909598edbb"} Apr 21 06:28:33.439697 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:33.439602 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/telemeter-client-785b4dc9d7-nfpwc" podStartSLOduration=1.574677369 podStartE2EDuration="4.439588994s" podCreationTimestamp="2026-04-21 06:28:29 +0000 UTC" firstStartedPulling="2026-04-21 06:28:30.306140578 +0000 UTC m=+114.958586375" lastFinishedPulling="2026-04-21 06:28:33.171052192 +0000 UTC m=+117.823498000" observedRunningTime="2026-04-21 06:28:33.438210093 +0000 UTC m=+118.090655912" watchObservedRunningTime="2026-04-21 06:28:33.439588994 +0000 UTC m=+118.092034790" Apr 21 06:28:34.288946 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.288904 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" podUID="20805057-b0bc-4289-a705-2e946efadc98" containerName="registry" containerID="cri-o://5a008cbcd9e20460a2cab6f31f2d24520be00cd07bdb272fffc5e391f902ad37" gracePeriod=30 Apr 21 06:28:34.401223 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.401186 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86974fd9db-9pnxc"] Apr 21 06:28:34.422706 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.422672 2577 generic.go:358] "Generic (PLEG): container finished" podID="20805057-b0bc-4289-a705-2e946efadc98" containerID="5a008cbcd9e20460a2cab6f31f2d24520be00cd07bdb272fffc5e391f902ad37" exitCode=0 Apr 21 06:28:34.423087 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.422769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" event={"ID":"20805057-b0bc-4289-a705-2e946efadc98","Type":"ContainerDied","Data":"5a008cbcd9e20460a2cab6f31f2d24520be00cd07bdb272fffc5e391f902ad37"} Apr 21 06:28:34.531606 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.531580 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:28:34.674050 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.673959 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-bound-sa-token\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.674050 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.674017 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20805057-b0bc-4289-a705-2e946efadc98-ca-trust-extracted\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.674050 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.674052 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.674264 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.674226 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpfhd\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-kube-api-access-xpfhd\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.674334 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.674316 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-image-registry-private-configuration\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: 
\"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.674384 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.674368 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-installation-pull-secrets\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.674438 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.674419 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-trusted-ca\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.674575 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.674448 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-registry-certificates\") pod \"20805057-b0bc-4289-a705-2e946efadc98\" (UID: \"20805057-b0bc-4289-a705-2e946efadc98\") " Apr 21 06:28:34.675187 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.675111 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:34.675296 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.675217 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:34.677007 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.676980 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:28:34.677108 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.676999 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:28:34.677108 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.677011 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:28:34.677108 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.677080 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-kube-api-access-xpfhd" (OuterVolumeSpecName: "kube-api-access-xpfhd") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "kube-api-access-xpfhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:28:34.677243 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.677156 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:28:34.683150 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.683129 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20805057-b0bc-4289-a705-2e946efadc98-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "20805057-b0bc-4289-a705-2e946efadc98" (UID: "20805057-b0bc-4289-a705-2e946efadc98"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:28:34.775316 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775275 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20805057-b0bc-4289-a705-2e946efadc98-ca-trust-extracted\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:34.775316 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775311 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-registry-tls\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:34.775515 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775324 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpfhd\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-kube-api-access-xpfhd\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:34.775515 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775340 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-image-registry-private-configuration\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:34.775515 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775352 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20805057-b0bc-4289-a705-2e946efadc98-installation-pull-secrets\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:34.775515 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775364 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-trusted-ca\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 
06:28:34.775515 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775375 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20805057-b0bc-4289-a705-2e946efadc98-registry-certificates\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:34.775515 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:34.775386 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20805057-b0bc-4289-a705-2e946efadc98-bound-sa-token\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:35.426843 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:35.426803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" event={"ID":"20805057-b0bc-4289-a705-2e946efadc98","Type":"ContainerDied","Data":"13e26414c9c5e053be761d3cf2a21da8725f9e0fa249035e5de3d8e1ee5f7008"} Apr 21 06:28:35.427299 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:35.426853 2577 scope.go:117] "RemoveContainer" containerID="5a008cbcd9e20460a2cab6f31f2d24520be00cd07bdb272fffc5e391f902ad37" Apr 21 06:28:35.427299 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:35.426860 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7579d96757-p2wbq" Apr 21 06:28:35.447805 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:35.447779 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7579d96757-p2wbq"] Apr 21 06:28:35.456844 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:35.456823 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7579d96757-p2wbq"] Apr 21 06:28:35.936138 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:35.934991 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20805057-b0bc-4289-a705-2e946efadc98" path="/var/lib/kubelet/pods/20805057-b0bc-4289-a705-2e946efadc98/volumes" Apr 21 06:28:37.622057 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:37.622017 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86974fd9db-9pnxc" Apr 21 06:28:40.502121 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:40.502076 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:40.502562 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:40.502227 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:40.507084 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:40.507062 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:41.447801 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:41.447774 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68d755874-bt42h" Apr 21 06:28:45.683096 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:45.683062 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68d755874-bt42h"] Apr 21 06:28:45.774803 
ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:45.774764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:28:45.777335 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:45.777305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14257089-c0ac-4007-81fc-ff9a9034e71b-metrics-certs\") pod \"network-metrics-daemon-276tk\" (UID: \"14257089-c0ac-4007-81fc-ff9a9034e71b\") " pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:28:46.037893 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:46.037864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p6jr2\"" Apr 21 06:28:46.046527 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:46.046507 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-276tk" Apr 21 06:28:46.177565 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:46.177469 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-276tk"] Apr 21 06:28:46.179853 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:28:46.179827 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14257089_c0ac_4007_81fc_ff9a9034e71b.slice/crio-d64b79712842b0428c1feb8cf881698520e6fc6ef7f8a908eff70ba239dd05b8 WatchSource:0}: Error finding container d64b79712842b0428c1feb8cf881698520e6fc6ef7f8a908eff70ba239dd05b8: Status 404 returned error can't find the container with id d64b79712842b0428c1feb8cf881698520e6fc6ef7f8a908eff70ba239dd05b8 Apr 21 06:28:46.462419 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:46.462333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-276tk" event={"ID":"14257089-c0ac-4007-81fc-ff9a9034e71b","Type":"ContainerStarted","Data":"d64b79712842b0428c1feb8cf881698520e6fc6ef7f8a908eff70ba239dd05b8"} Apr 21 06:28:47.466619 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:47.466586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-276tk" event={"ID":"14257089-c0ac-4007-81fc-ff9a9034e71b","Type":"ContainerStarted","Data":"f070a29b57d63c755f7b25fe1042820c95743a6e764125253ddebc86c218beb4"} Apr 21 06:28:47.466619 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:47.466624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-276tk" event={"ID":"14257089-c0ac-4007-81fc-ff9a9034e71b","Type":"ContainerStarted","Data":"525da00ae5b22368f9f7b66389f0b552ec14b32d538a5df9fdba12330a8254bc"} Apr 21 06:28:47.483235 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:47.483187 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-276tk" podStartSLOduration=130.579674735 podStartE2EDuration="2m11.483166816s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:28:46.181756879 +0000 UTC m=+130.834202683" lastFinishedPulling="2026-04-21 06:28:47.085248958 +0000 UTC m=+131.737694764" observedRunningTime="2026-04-21 06:28:47.48263948 +0000 UTC m=+132.135085298" watchObservedRunningTime="2026-04-21 06:28:47.483166816 +0000 UTC m=+132.135612637" Apr 21 06:28:51.479631 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:51.479591 2577 generic.go:358] "Generic (PLEG): container finished" podID="504b76b3-d116-4731-aca7-01cb1970de58" containerID="7bce394258ad157e5de24e5dfb555c72a548d4d9a2b61590f0f7c3ef0f887150" exitCode=0 Apr 21 06:28:51.480048 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:51.479640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" event={"ID":"504b76b3-d116-4731-aca7-01cb1970de58","Type":"ContainerDied","Data":"7bce394258ad157e5de24e5dfb555c72a548d4d9a2b61590f0f7c3ef0f887150"} Apr 21 06:28:51.480048 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:51.479975 2577 scope.go:117] "RemoveContainer" containerID="7bce394258ad157e5de24e5dfb555c72a548d4d9a2b61590f0f7c3ef0f887150" Apr 21 06:28:52.484549 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:52.484515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-j2sxp" event={"ID":"504b76b3-d116-4731-aca7-01cb1970de58","Type":"ContainerStarted","Data":"ccaeff0d1281e9f453f068d0766db7f5c53d1d89f2768503d6cd82dfa0e0a35d"} Apr 21 06:28:55.152193 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.152127 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68f6966bf9-flnb6" 
podUID="bda38237-5a37-4a09-ae06-dfa8f2dc1240" containerName="console" containerID="cri-o://20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604" gracePeriod=15 Apr 21 06:28:55.389883 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.389859 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f6966bf9-flnb6_bda38237-5a37-4a09-ae06-dfa8f2dc1240/console/0.log" Apr 21 06:28:55.390023 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.389918 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:55.457664 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.457575 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-oauth-serving-cert\") pod \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " Apr 21 06:28:55.457664 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.457621 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-oauth-config\") pod \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " Apr 21 06:28:55.457664 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.457663 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-service-ca\") pod \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " Apr 21 06:28:55.457965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.457682 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-serving-cert\") pod \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " Apr 21 06:28:55.457965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.457708 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-config\") pod \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " Apr 21 06:28:55.457965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.457836 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzck\" (UniqueName: \"kubernetes.io/projected/bda38237-5a37-4a09-ae06-dfa8f2dc1240-kube-api-access-grzck\") pod \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\" (UID: \"bda38237-5a37-4a09-ae06-dfa8f2dc1240\") " Apr 21 06:28:55.458160 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.458134 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bda38237-5a37-4a09-ae06-dfa8f2dc1240" (UID: "bda38237-5a37-4a09-ae06-dfa8f2dc1240"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:55.458222 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.458139 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-service-ca" (OuterVolumeSpecName: "service-ca") pod "bda38237-5a37-4a09-ae06-dfa8f2dc1240" (UID: "bda38237-5a37-4a09-ae06-dfa8f2dc1240"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:55.458258 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.458233 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-config" (OuterVolumeSpecName: "console-config") pod "bda38237-5a37-4a09-ae06-dfa8f2dc1240" (UID: "bda38237-5a37-4a09-ae06-dfa8f2dc1240"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:55.460257 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.460234 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bda38237-5a37-4a09-ae06-dfa8f2dc1240" (UID: "bda38237-5a37-4a09-ae06-dfa8f2dc1240"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:28:55.460494 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.460474 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda38237-5a37-4a09-ae06-dfa8f2dc1240-kube-api-access-grzck" (OuterVolumeSpecName: "kube-api-access-grzck") pod "bda38237-5a37-4a09-ae06-dfa8f2dc1240" (UID: "bda38237-5a37-4a09-ae06-dfa8f2dc1240"). InnerVolumeSpecName "kube-api-access-grzck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:28:55.460536 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.460488 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bda38237-5a37-4a09-ae06-dfa8f2dc1240" (UID: "bda38237-5a37-4a09-ae06-dfa8f2dc1240"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:28:55.499420 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.499391 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68f6966bf9-flnb6_bda38237-5a37-4a09-ae06-dfa8f2dc1240/console/0.log" Apr 21 06:28:55.499610 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.499432 2577 generic.go:358] "Generic (PLEG): container finished" podID="bda38237-5a37-4a09-ae06-dfa8f2dc1240" containerID="20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604" exitCode=2 Apr 21 06:28:55.499610 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.499464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f6966bf9-flnb6" event={"ID":"bda38237-5a37-4a09-ae06-dfa8f2dc1240","Type":"ContainerDied","Data":"20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604"} Apr 21 06:28:55.499610 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.499508 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68f6966bf9-flnb6" event={"ID":"bda38237-5a37-4a09-ae06-dfa8f2dc1240","Type":"ContainerDied","Data":"aaab1c103a755fab8eeae88ddb2e6f0c4c7e13e65bef422132afc31b54ae979c"} Apr 21 06:28:55.499610 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.499517 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68f6966bf9-flnb6" Apr 21 06:28:55.499610 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.499527 2577 scope.go:117] "RemoveContainer" containerID="20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604" Apr 21 06:28:55.509741 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.509634 2577 scope.go:117] "RemoveContainer" containerID="20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604" Apr 21 06:28:55.510186 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:28:55.510158 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604\": container with ID starting with 20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604 not found: ID does not exist" containerID="20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604" Apr 21 06:28:55.510266 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.510198 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604"} err="failed to get container status \"20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604\": rpc error: code = NotFound desc = could not find container \"20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604\": container with ID starting with 20313b9016a2a96495eaf42983412eb6530ba2ec622a0aa5c80619278be1c604 not found: ID does not exist" Apr 21 06:28:55.521487 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.521444 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68f6966bf9-flnb6"] Apr 21 06:28:55.528671 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.528649 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68f6966bf9-flnb6"] Apr 21 06:28:55.558712 ip-10-0-129-55 kubenswrapper[2577]: I0421 
06:28:55.558680 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grzck\" (UniqueName: \"kubernetes.io/projected/bda38237-5a37-4a09-ae06-dfa8f2dc1240-kube-api-access-grzck\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:55.558712 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.558711 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-oauth-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:55.558897 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.558721 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-oauth-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:55.558897 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.558751 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-service-ca\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:55.558897 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.558760 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:55.558897 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.558768 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda38237-5a37-4a09-ae06-dfa8f2dc1240-console-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:28:55.930001 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:55.929963 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bda38237-5a37-4a09-ae06-dfa8f2dc1240" path="/var/lib/kubelet/pods/bda38237-5a37-4a09-ae06-dfa8f2dc1240/volumes" Apr 21 06:28:59.421675 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.421616 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86974fd9db-9pnxc" podUID="c7c85c7b-7eef-408f-8525-051838d8f407" containerName="console" containerID="cri-o://775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1" gracePeriod=15 Apr 21 06:28:59.661741 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.661700 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86974fd9db-9pnxc_c7c85c7b-7eef-408f-8525-051838d8f407/console/0.log" Apr 21 06:28:59.661869 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.661778 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86974fd9db-9pnxc" Apr 21 06:28:59.687397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687332 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-console-config\") pod \"c7c85c7b-7eef-408f-8525-051838d8f407\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " Apr 21 06:28:59.687397 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687383 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-service-ca\") pod \"c7c85c7b-7eef-408f-8525-051838d8f407\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " Apr 21 06:28:59.687626 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687417 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-oauth-serving-cert\") pod 
\"c7c85c7b-7eef-408f-8525-051838d8f407\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " Apr 21 06:28:59.687626 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687433 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-trusted-ca-bundle\") pod \"c7c85c7b-7eef-408f-8525-051838d8f407\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " Apr 21 06:28:59.687626 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687451 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-oauth-config\") pod \"c7c85c7b-7eef-408f-8525-051838d8f407\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " Apr 21 06:28:59.687813 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687789 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-serving-cert\") pod \"c7c85c7b-7eef-408f-8525-051838d8f407\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " Apr 21 06:28:59.687873 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687814 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-console-config" (OuterVolumeSpecName: "console-config") pod "c7c85c7b-7eef-408f-8525-051838d8f407" (UID: "c7c85c7b-7eef-408f-8525-051838d8f407"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:59.687873 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687848 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxwvh\" (UniqueName: \"kubernetes.io/projected/c7c85c7b-7eef-408f-8525-051838d8f407-kube-api-access-rxwvh\") pod \"c7c85c7b-7eef-408f-8525-051838d8f407\" (UID: \"c7c85c7b-7eef-408f-8525-051838d8f407\") " Apr 21 06:28:59.687976 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687951 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c7c85c7b-7eef-408f-8525-051838d8f407" (UID: "c7c85c7b-7eef-408f-8525-051838d8f407"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:59.687976 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.687963 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-service-ca" (OuterVolumeSpecName: "service-ca") pod "c7c85c7b-7eef-408f-8525-051838d8f407" (UID: "c7c85c7b-7eef-408f-8525-051838d8f407"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 06:28:59.688130 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.688111 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-console-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:28:59.688208 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.688136 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-service-ca\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:28:59.688208 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.688150 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-oauth-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:28:59.688286 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.688244 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c7c85c7b-7eef-408f-8525-051838d8f407" (UID: "c7c85c7b-7eef-408f-8525-051838d8f407"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 06:28:59.690045 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.690023 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c85c7b-7eef-408f-8525-051838d8f407-kube-api-access-rxwvh" (OuterVolumeSpecName: "kube-api-access-rxwvh") pod "c7c85c7b-7eef-408f-8525-051838d8f407" (UID: "c7c85c7b-7eef-408f-8525-051838d8f407"). InnerVolumeSpecName "kube-api-access-rxwvh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 06:28:59.690406 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.690381 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c7c85c7b-7eef-408f-8525-051838d8f407" (UID: "c7c85c7b-7eef-408f-8525-051838d8f407"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 06:28:59.690493 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.690409 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c7c85c7b-7eef-408f-8525-051838d8f407" (UID: "c7c85c7b-7eef-408f-8525-051838d8f407"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 06:28:59.788965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.788929 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:28:59.788965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.788960 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxwvh\" (UniqueName: \"kubernetes.io/projected/c7c85c7b-7eef-408f-8525-051838d8f407-kube-api-access-rxwvh\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:28:59.788965 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.788970 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c85c7b-7eef-408f-8525-051838d8f407-trusted-ca-bundle\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:28:59.789191 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:28:59.788979 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7c85c7b-7eef-408f-8525-051838d8f407-console-oauth-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:00.515278 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.515245 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86974fd9db-9pnxc_c7c85c7b-7eef-408f-8525-051838d8f407/console/0.log"
Apr 21 06:29:00.515688 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.515286 2577 generic.go:358] "Generic (PLEG): container finished" podID="c7c85c7b-7eef-408f-8525-051838d8f407" containerID="775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1" exitCode=2
Apr 21 06:29:00.515688 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.515319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86974fd9db-9pnxc" event={"ID":"c7c85c7b-7eef-408f-8525-051838d8f407","Type":"ContainerDied","Data":"775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1"}
Apr 21 06:29:00.515688 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.515342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86974fd9db-9pnxc" event={"ID":"c7c85c7b-7eef-408f-8525-051838d8f407","Type":"ContainerDied","Data":"8fa48c3b6e967f2fa3c4454d7312135ee70c59991c956afd8a883918a9e5276b"}
Apr 21 06:29:00.515688 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.515356 2577 scope.go:117] "RemoveContainer" containerID="775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1"
Apr 21 06:29:00.515688 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.515359 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86974fd9db-9pnxc"
Apr 21 06:29:00.527645 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.527625 2577 scope.go:117] "RemoveContainer" containerID="775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1"
Apr 21 06:29:00.527940 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:29:00.527918 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1\": container with ID starting with 775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1 not found: ID does not exist" containerID="775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1"
Apr 21 06:29:00.527985 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.527950 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1"} err="failed to get container status \"775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1\": rpc error: code = NotFound desc = could not find container \"775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1\": container with ID starting with 775c6e5fdb448ef728579cc3d77d303f749ed231ea1e47a3947d2d32352b5cd1 not found: ID does not exist"
Apr 21 06:29:00.530881 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.530856 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86974fd9db-9pnxc"]
Apr 21 06:29:00.534223 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:00.534201 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86974fd9db-9pnxc"]
Apr 21 06:29:01.929745 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:01.929649 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c85c7b-7eef-408f-8525-051838d8f407" path="/var/lib/kubelet/pods/c7c85c7b-7eef-408f-8525-051838d8f407/volumes"
Apr 21 06:29:02.522875 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:02.522842 2577 generic.go:358] "Generic (PLEG): container finished" podID="fb78ba44-67d7-4a52-b661-9a1c6e9c6b38" containerID="5cc20b7e998117798c3045958aea75f18d6c4fbf6df0c8d5569b32f73bf17f95" exitCode=0
Apr 21 06:29:02.523043 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:02.522915 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5hfwt" event={"ID":"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38","Type":"ContainerDied","Data":"5cc20b7e998117798c3045958aea75f18d6c4fbf6df0c8d5569b32f73bf17f95"}
Apr 21 06:29:02.523292 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:02.523269 2577 scope.go:117] "RemoveContainer" containerID="5cc20b7e998117798c3045958aea75f18d6c4fbf6df0c8d5569b32f73bf17f95"
Apr 21 06:29:03.100225 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:03.100164 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5968f9cfc4-wxd8m_d743fd44-3762-47ee-9a4c-617f122ba333/router/0.log"
Apr 21 06:29:03.106016 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:03.105990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2dlsg_3336a9c5-62bd-44a2-8149-ccbdebfdb50a/serve-healthcheck-canary/0.log"
Apr 21 06:29:03.527362 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:03.527326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5hfwt" event={"ID":"fb78ba44-67d7-4a52-b661-9a1c6e9c6b38","Type":"ContainerStarted","Data":"9387e02ddabfd193b7f44ecd75b778b585cb87b62c301bce52f815c115fc0d27"}
Apr 21 06:29:10.707372 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:10.707313 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68d755874-bt42h" podUID="be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" containerName="console" containerID="cri-o://12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721" gracePeriod=15
Apr 21 06:29:10.948801 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:10.948780 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68d755874-bt42h_be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b/console/0.log"
Apr 21 06:29:10.948899 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:10.948853 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68d755874-bt42h"
Apr 21 06:29:11.063182 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063147 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-serving-cert\") pod \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") "
Apr 21 06:29:11.063357 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063201 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-service-ca\") pod \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") "
Apr 21 06:29:11.063357 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063229 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jl8w\" (UniqueName: \"kubernetes.io/projected/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-kube-api-access-7jl8w\") pod \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") "
Apr 21 06:29:11.063357 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063253 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-trusted-ca-bundle\") pod \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") "
Apr 21 06:29:11.063357 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063269 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-oauth-config\") pod \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") "
Apr 21 06:29:11.063357 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063288 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-oauth-serving-cert\") pod \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") "
Apr 21 06:29:11.063357 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063324 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-config\") pod \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\" (UID: \"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b\") "
Apr 21 06:29:11.063734 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063704 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" (UID: "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 06:29:11.063864 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063697 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" (UID: "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 06:29:11.064096 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063865 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-config" (OuterVolumeSpecName: "console-config") pod "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" (UID: "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 06:29:11.064096 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.063940 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-service-ca" (OuterVolumeSpecName: "service-ca") pod "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" (UID: "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 06:29:11.065627 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.065600 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-kube-api-access-7jl8w" (OuterVolumeSpecName: "kube-api-access-7jl8w") pod "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" (UID: "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b"). InnerVolumeSpecName "kube-api-access-7jl8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 06:29:11.065997 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.065977 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" (UID: "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 06:29:11.066072 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.065993 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" (UID: "be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 06:29:11.164540 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.164499 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:11.164540 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.164534 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-service-ca\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:11.164540 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.164548 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jl8w\" (UniqueName: \"kubernetes.io/projected/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-kube-api-access-7jl8w\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:11.164828 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.164560 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-trusted-ca-bundle\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:11.164828 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.164573 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-oauth-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:11.164828 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.164593 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-oauth-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:11.164828 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.164605 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b-console-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:29:11.550593 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.550568 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68d755874-bt42h_be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b/console/0.log"
Apr 21 06:29:11.550781 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.550610 2577 generic.go:358] "Generic (PLEG): container finished" podID="be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" containerID="12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721" exitCode=2
Apr 21 06:29:11.550781 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.550677 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68d755874-bt42h"
Apr 21 06:29:11.550781 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.550690 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d755874-bt42h" event={"ID":"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b","Type":"ContainerDied","Data":"12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721"}
Apr 21 06:29:11.550781 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.550742 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68d755874-bt42h" event={"ID":"be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b","Type":"ContainerDied","Data":"328d95c02019418451b45889a64025afddec3cbda7d07ed9c75f712c3721ec18"}
Apr 21 06:29:11.550781 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.550758 2577 scope.go:117] "RemoveContainer" containerID="12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721"
Apr 21 06:29:11.563348 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.563329 2577 scope.go:117] "RemoveContainer" containerID="12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721"
Apr 21 06:29:11.563589 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:29:11.563569 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721\": container with ID starting with 12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721 not found: ID does not exist" containerID="12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721"
Apr 21 06:29:11.563639 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.563597 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721"} err="failed to get container status \"12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721\": rpc error: code = NotFound desc = could not find container \"12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721\": container with ID starting with 12c72bdc23392b23caa8a2473ae251240828aff327dc318901703f2f7ca13721 not found: ID does not exist"
Apr 21 06:29:11.572996 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.572973 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68d755874-bt42h"]
Apr 21 06:29:11.579217 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.579193 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68d755874-bt42h"]
Apr 21 06:29:11.930159 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:11.930081 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" path="/var/lib/kubelet/pods/be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b/volumes"
Apr 21 06:29:47.505646 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505606 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-769674575b-kbdbh"]
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505906 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bda38237-5a37-4a09-ae06-dfa8f2dc1240" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505919 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda38237-5a37-4a09-ae06-dfa8f2dc1240" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505934 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7c85c7b-7eef-408f-8525-051838d8f407" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505939 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c85c7b-7eef-408f-8525-051838d8f407" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505945 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20805057-b0bc-4289-a705-2e946efadc98" containerName="registry"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505950 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="20805057-b0bc-4289-a705-2e946efadc98" containerName="registry"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505959 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.505964 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.506008 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="be505ae2-2ddf-4cf9-a5ae-fa6b5e9f279b" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.506017 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="20805057-b0bc-4289-a705-2e946efadc98" containerName="registry"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.506023 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bda38237-5a37-4a09-ae06-dfa8f2dc1240" containerName="console"
Apr 21 06:29:47.506261 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.506030 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7c85c7b-7eef-408f-8525-051838d8f407" containerName="console"
Apr 21 06:29:47.508978 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.508950 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.511402 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.511376 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 06:29:47.512046 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.512017 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 06:29:47.512182 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.512056 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 06:29:47.512182 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.512024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 06:29:47.512182 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.512167 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 06:29:47.512382 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.512273 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 06:29:47.512382 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.512276 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-rk6ld\""
Apr 21 06:29:47.512520 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.512411 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 06:29:47.515770 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.515747 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 06:29:47.520497 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.520474 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769674575b-kbdbh"]
Apr 21 06:29:47.659973 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.659940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-trusted-ca-bundle\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.660142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.659982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-oauth-config\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.660142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.660007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-config\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.660142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.660085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-oauth-serving-cert\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.660273 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.660143 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kkpt\" (UniqueName: \"kubernetes.io/projected/b8cf4192-47a8-48b0-bcd2-b95b68a30478-kube-api-access-5kkpt\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.660273 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.660165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-serving-cert\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.660273 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.660261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-service-ca\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761098 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-service-ca\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761098 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-trusted-ca-bundle\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761098 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-oauth-config\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761371 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-config\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761371 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-oauth-serving-cert\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761371 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kkpt\" (UniqueName: \"kubernetes.io/projected/b8cf4192-47a8-48b0-bcd2-b95b68a30478-kube-api-access-5kkpt\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761371 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-serving-cert\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.761840 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.761815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-service-ca\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.762139 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.762064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-config\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.762187 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.762134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-trusted-ca-bundle\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.762187 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.762160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-oauth-serving-cert\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.763774 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.763753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-serving-cert\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.763900 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.763880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-oauth-config\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.769185 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.769157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kkpt\" (UniqueName: \"kubernetes.io/projected/b8cf4192-47a8-48b0-bcd2-b95b68a30478-kube-api-access-5kkpt\") pod \"console-769674575b-kbdbh\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.820869 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.820835 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:47.955144 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:47.955107 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769674575b-kbdbh"]
Apr 21 06:29:47.958559 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:29:47.958515 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8cf4192_47a8_48b0_bcd2_b95b68a30478.slice/crio-887ce6b587c5a842ddfed1815b58efb402c556f87301f1cf13d250a5b31cd00b WatchSource:0}: Error finding container 887ce6b587c5a842ddfed1815b58efb402c556f87301f1cf13d250a5b31cd00b: Status 404 returned error can't find the container with id 887ce6b587c5a842ddfed1815b58efb402c556f87301f1cf13d250a5b31cd00b
Apr 21 06:29:48.655498 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:48.655454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769674575b-kbdbh" event={"ID":"b8cf4192-47a8-48b0-bcd2-b95b68a30478","Type":"ContainerStarted","Data":"edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7"}
Apr 21 06:29:48.655498 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:48.655501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769674575b-kbdbh" event={"ID":"b8cf4192-47a8-48b0-bcd2-b95b68a30478","Type":"ContainerStarted","Data":"887ce6b587c5a842ddfed1815b58efb402c556f87301f1cf13d250a5b31cd00b"}
Apr 21 06:29:48.670808 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:48.670755 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-769674575b-kbdbh" podStartSLOduration=1.670720142 podStartE2EDuration="1.670720142s" podCreationTimestamp="2026-04-21 06:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:29:48.669879718 +0000 UTC m=+193.322325537" watchObservedRunningTime="2026-04-21 06:29:48.670720142 +0000 UTC m=+193.323165964"
Apr 21 06:29:57.821340 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:57.821290 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:57.821967 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:57.821670 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:57.826274 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:57.826250 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:29:58.688122 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:29:58.688094 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-769674575b-kbdbh"
Apr 21 06:30:34.660405 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.660370 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vd2jf"]
Apr 21 06:30:34.662845 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.662822 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vd2jf"
Apr 21 06:30:34.665360 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.665341 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 06:30:34.672427 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.672404 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vd2jf"]
Apr 21 06:30:34.728095 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.728064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e87fa991-449a-49d9-be91-c70826350171-original-pull-secret\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf"
Apr 21 06:30:34.728095 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.728105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e87fa991-449a-49d9-be91-c70826350171-dbus\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf"
Apr 21 06:30:34.728305 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.728138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e87fa991-449a-49d9-be91-c70826350171-kubelet-config\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf"
Apr 21 06:30:34.829255 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.829212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName:
\"kubernetes.io/secret/e87fa991-449a-49d9-be91-c70826350171-original-pull-secret\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf" Apr 21 06:30:34.829255 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.829259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e87fa991-449a-49d9-be91-c70826350171-dbus\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf" Apr 21 06:30:34.829531 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.829375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e87fa991-449a-49d9-be91-c70826350171-kubelet-config\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf" Apr 21 06:30:34.829531 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.829418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e87fa991-449a-49d9-be91-c70826350171-dbus\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf" Apr 21 06:30:34.829531 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.829478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e87fa991-449a-49d9-be91-c70826350171-kubelet-config\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf" Apr 21 06:30:34.831611 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.831594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e87fa991-449a-49d9-be91-c70826350171-original-pull-secret\") pod \"global-pull-secret-syncer-vd2jf\" (UID: \"e87fa991-449a-49d9-be91-c70826350171\") " pod="kube-system/global-pull-secret-syncer-vd2jf" Apr 21 06:30:34.972402 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:34.972368 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vd2jf" Apr 21 06:30:35.093778 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:35.093753 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vd2jf"] Apr 21 06:30:35.096401 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:30:35.096372 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87fa991_449a_49d9_be91_c70826350171.slice/crio-bda92d8153d223af82cebe1555e523e0ca0a730b85a28e93436a05be2728afc1 WatchSource:0}: Error finding container bda92d8153d223af82cebe1555e523e0ca0a730b85a28e93436a05be2728afc1: Status 404 returned error can't find the container with id bda92d8153d223af82cebe1555e523e0ca0a730b85a28e93436a05be2728afc1 Apr 21 06:30:35.787696 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:35.787652 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vd2jf" event={"ID":"e87fa991-449a-49d9-be91-c70826350171","Type":"ContainerStarted","Data":"bda92d8153d223af82cebe1555e523e0ca0a730b85a28e93436a05be2728afc1"} Apr 21 06:30:39.806090 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:39.806044 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vd2jf" event={"ID":"e87fa991-449a-49d9-be91-c70826350171","Type":"ContainerStarted","Data":"05654a289d7e103c82b2aa2d1d33cbe5dfae89b33ee00e497ef26209e10457d0"} Apr 21 06:30:39.820058 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:30:39.820001 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vd2jf" podStartSLOduration=2.015922091 podStartE2EDuration="5.819986166s" podCreationTimestamp="2026-04-21 06:30:34 +0000 UTC" firstStartedPulling="2026-04-21 06:30:35.098057672 +0000 UTC m=+239.750503469" lastFinishedPulling="2026-04-21 06:30:38.902121746 +0000 UTC m=+243.554567544" observedRunningTime="2026-04-21 06:30:39.818963706 +0000 UTC m=+244.471409522" watchObservedRunningTime="2026-04-21 06:30:39.819986166 +0000 UTC m=+244.472431985" Apr 21 06:31:15.828663 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.828624 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj"] Apr 21 06:31:15.832797 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.832779 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:15.835137 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.835116 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 06:31:15.835254 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.835146 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 06:31:15.835721 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.835706 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lqhdj\"" Apr 21 06:31:15.841395 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.841373 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj"] Apr 21 06:31:15.969773 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.969707 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:15.969953 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.969820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2hrb\" (UniqueName: \"kubernetes.io/projected/ffe66563-f6a0-4586-ab77-45c97c0286ad-kube-api-access-z2hrb\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:15.969953 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:15.969881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.070302 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.070266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.070473 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.070321 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2hrb\" (UniqueName: \"kubernetes.io/projected/ffe66563-f6a0-4586-ab77-45c97c0286ad-kube-api-access-z2hrb\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.070473 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.070364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.070653 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.070635 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.070710 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.070691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.078370 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.078344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2hrb\" 
(UniqueName: \"kubernetes.io/projected/ffe66563-f6a0-4586-ab77-45c97c0286ad-kube-api-access-z2hrb\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.142389 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.142312 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:16.265106 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.265082 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj"] Apr 21 06:31:16.267479 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:31:16.267450 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffe66563_f6a0_4586_ab77_45c97c0286ad.slice/crio-65aa9319e07ab850dd44edcbf715a8ed4c2999cabc6ab3c5582973cf200a871c WatchSource:0}: Error finding container 65aa9319e07ab850dd44edcbf715a8ed4c2999cabc6ab3c5582973cf200a871c: Status 404 returned error can't find the container with id 65aa9319e07ab850dd44edcbf715a8ed4c2999cabc6ab3c5582973cf200a871c Apr 21 06:31:16.909424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:16.909382 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" event={"ID":"ffe66563-f6a0-4586-ab77-45c97c0286ad","Type":"ContainerStarted","Data":"65aa9319e07ab850dd44edcbf715a8ed4c2999cabc6ab3c5582973cf200a871c"} Apr 21 06:31:21.928317 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:21.928283 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerID="9b5001d4fdb51407d89c6141a56d6b9ddf6db6f1533ecf1e80cd330c4e8ff63d" 
exitCode=0 Apr 21 06:31:21.929243 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:21.929129 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" event={"ID":"ffe66563-f6a0-4586-ab77-45c97c0286ad","Type":"ContainerDied","Data":"9b5001d4fdb51407d89c6141a56d6b9ddf6db6f1533ecf1e80cd330c4e8ff63d"} Apr 21 06:31:23.936169 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:23.936132 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerID="c0216c4919fed31bc101c323c7d76ae535719baab65e99ae03d438227bd5f460" exitCode=0 Apr 21 06:31:23.936549 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:23.936227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" event={"ID":"ffe66563-f6a0-4586-ab77-45c97c0286ad","Type":"ContainerDied","Data":"c0216c4919fed31bc101c323c7d76ae535719baab65e99ae03d438227bd5f460"} Apr 21 06:31:30.959135 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:30.959094 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerID="75e8df25af56538e72549a5e005ef9210e8536adeeba3194eac0d88a8a2f2aeb" exitCode=0 Apr 21 06:31:30.959603 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:30.959163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" event={"ID":"ffe66563-f6a0-4586-ab77-45c97c0286ad","Type":"ContainerDied","Data":"75e8df25af56538e72549a5e005ef9210e8536adeeba3194eac0d88a8a2f2aeb"} Apr 21 06:31:32.082788 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.082765 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:32.104641 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.104614 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-bundle\") pod \"ffe66563-f6a0-4586-ab77-45c97c0286ad\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " Apr 21 06:31:32.104813 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.104684 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-util\") pod \"ffe66563-f6a0-4586-ab77-45c97c0286ad\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " Apr 21 06:31:32.104813 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.104721 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2hrb\" (UniqueName: \"kubernetes.io/projected/ffe66563-f6a0-4586-ab77-45c97c0286ad-kube-api-access-z2hrb\") pod \"ffe66563-f6a0-4586-ab77-45c97c0286ad\" (UID: \"ffe66563-f6a0-4586-ab77-45c97c0286ad\") " Apr 21 06:31:32.105502 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.105473 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-bundle" (OuterVolumeSpecName: "bundle") pod "ffe66563-f6a0-4586-ab77-45c97c0286ad" (UID: "ffe66563-f6a0-4586-ab77-45c97c0286ad"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:31:32.106935 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.106906 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe66563-f6a0-4586-ab77-45c97c0286ad-kube-api-access-z2hrb" (OuterVolumeSpecName: "kube-api-access-z2hrb") pod "ffe66563-f6a0-4586-ab77-45c97c0286ad" (UID: "ffe66563-f6a0-4586-ab77-45c97c0286ad"). InnerVolumeSpecName "kube-api-access-z2hrb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:31:32.110225 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.110194 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-util" (OuterVolumeSpecName: "util") pod "ffe66563-f6a0-4586-ab77-45c97c0286ad" (UID: "ffe66563-f6a0-4586-ab77-45c97c0286ad"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:31:32.206342 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.206303 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-bundle\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:31:32.206342 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.206333 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffe66563-f6a0-4586-ab77-45c97c0286ad-util\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:31:32.206342 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.206342 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2hrb\" (UniqueName: \"kubernetes.io/projected/ffe66563-f6a0-4586-ab77-45c97c0286ad-kube-api-access-z2hrb\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:31:32.966533 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.966497 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" event={"ID":"ffe66563-f6a0-4586-ab77-45c97c0286ad","Type":"ContainerDied","Data":"65aa9319e07ab850dd44edcbf715a8ed4c2999cabc6ab3c5582973cf200a871c"} Apr 21 06:31:32.966533 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.966531 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65aa9319e07ab850dd44edcbf715a8ed4c2999cabc6ab3c5582973cf200a871c" Apr 21 06:31:32.966533 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:32.966535 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dzhggj" Apr 21 06:31:35.802861 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:35.802827 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:31:35.803544 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:35.803522 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:31:35.809644 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:35.809618 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:31:35.809983 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:35.809965 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:31:35.812522 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:35.812503 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 06:31:38.768101 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768070 
2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g"] Apr 21 06:31:38.815582 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768373 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerName="pull" Apr 21 06:31:38.815582 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768383 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerName="pull" Apr 21 06:31:38.815582 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768392 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerName="util" Apr 21 06:31:38.815582 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768398 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerName="util" Apr 21 06:31:38.815582 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768404 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerName="extract" Apr 21 06:31:38.815582 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768409 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerName="extract" Apr 21 06:31:38.815582 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.768455 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ffe66563-f6a0-4586-ab77-45c97c0286ad" containerName="extract" Apr 21 06:31:38.822826 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.822788 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g"] Apr 21 06:31:38.822942 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.822854 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" Apr 21 06:31:38.825282 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.825261 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 06:31:38.825391 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.825300 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 06:31:38.825391 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.825353 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-mflvq\"" Apr 21 06:31:38.856781 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.856705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c69f881-40fd-4578-9d8b-d4a560001b87-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-zs76g\" (UID: \"6c69f881-40fd-4578-9d8b-d4a560001b87\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" Apr 21 06:31:38.856956 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.856924 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgmrd\" (UniqueName: \"kubernetes.io/projected/6c69f881-40fd-4578-9d8b-d4a560001b87-kube-api-access-lgmrd\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-zs76g\" (UID: \"6c69f881-40fd-4578-9d8b-d4a560001b87\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" Apr 21 06:31:38.957765 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.957708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/6c69f881-40fd-4578-9d8b-d4a560001b87-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-zs76g\" (UID: \"6c69f881-40fd-4578-9d8b-d4a560001b87\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" Apr 21 06:31:38.957885 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.957814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgmrd\" (UniqueName: \"kubernetes.io/projected/6c69f881-40fd-4578-9d8b-d4a560001b87-kube-api-access-lgmrd\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-zs76g\" (UID: \"6c69f881-40fd-4578-9d8b-d4a560001b87\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" Apr 21 06:31:38.958095 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.958072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c69f881-40fd-4578-9d8b-d4a560001b87-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-zs76g\" (UID: \"6c69f881-40fd-4578-9d8b-d4a560001b87\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" Apr 21 06:31:38.966036 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:38.965991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgmrd\" (UniqueName: \"kubernetes.io/projected/6c69f881-40fd-4578-9d8b-d4a560001b87-kube-api-access-lgmrd\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-zs76g\" (UID: \"6c69f881-40fd-4578-9d8b-d4a560001b87\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" Apr 21 06:31:39.131980 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:39.131891 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g"
Apr 21 06:31:39.258043 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:39.258015 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g"]
Apr 21 06:31:39.260432 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:31:39.260404 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c69f881_40fd_4578_9d8b_d4a560001b87.slice/crio-8a6d79525782f1d53feed1af8a04669df006322e43bb055400e63e90a21b1870 WatchSource:0}: Error finding container 8a6d79525782f1d53feed1af8a04669df006322e43bb055400e63e90a21b1870: Status 404 returned error can't find the container with id 8a6d79525782f1d53feed1af8a04669df006322e43bb055400e63e90a21b1870
Apr 21 06:31:39.262962 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:39.262945 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 06:31:39.988787 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:39.988743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" event={"ID":"6c69f881-40fd-4578-9d8b-d4a560001b87","Type":"ContainerStarted","Data":"8a6d79525782f1d53feed1af8a04669df006322e43bb055400e63e90a21b1870"}
Apr 21 06:31:41.996956 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:41.996916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" event={"ID":"6c69f881-40fd-4578-9d8b-d4a560001b87","Type":"ContainerStarted","Data":"09d663ae43b9e0fcaa2a1597d46d76c5588e944ba7e00f13e870856c29da63a1"}
Apr 21 06:31:42.015140 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:42.015088 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-zs76g" podStartSLOduration=1.794322211 podStartE2EDuration="4.015071623s" podCreationTimestamp="2026-04-21 06:31:38 +0000 UTC" firstStartedPulling="2026-04-21 06:31:39.263070078 +0000 UTC m=+303.915515875" lastFinishedPulling="2026-04-21 06:31:41.48381949 +0000 UTC m=+306.136265287" observedRunningTime="2026-04-21 06:31:42.013623105 +0000 UTC m=+306.666068960" watchObservedRunningTime="2026-04-21 06:31:42.015071623 +0000 UTC m=+306.667517443"
Apr 21 06:31:45.183989 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.183958 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-dsglc"]
Apr 21 06:31:45.188183 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.188166 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.190481 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.190456 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 06:31:45.190604 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.190533 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-czl5m\""
Apr 21 06:31:45.191293 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.191272 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 06:31:45.194034 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.194015 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-dsglc"]
Apr 21 06:31:45.310884 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.310848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ed58049-7535-4707-b734-1b452c05aa7a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-dsglc\" (UID: \"4ed58049-7535-4707-b734-1b452c05aa7a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.311052 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.310894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqjnn\" (UniqueName: \"kubernetes.io/projected/4ed58049-7535-4707-b734-1b452c05aa7a-kube-api-access-nqjnn\") pod \"cert-manager-webhook-587ccfb98-dsglc\" (UID: \"4ed58049-7535-4707-b734-1b452c05aa7a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.411481 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.411441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ed58049-7535-4707-b734-1b452c05aa7a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-dsglc\" (UID: \"4ed58049-7535-4707-b734-1b452c05aa7a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.411682 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.411495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqjnn\" (UniqueName: \"kubernetes.io/projected/4ed58049-7535-4707-b734-1b452c05aa7a-kube-api-access-nqjnn\") pod \"cert-manager-webhook-587ccfb98-dsglc\" (UID: \"4ed58049-7535-4707-b734-1b452c05aa7a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.419451 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.419418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqjnn\" (UniqueName: \"kubernetes.io/projected/4ed58049-7535-4707-b734-1b452c05aa7a-kube-api-access-nqjnn\") pod \"cert-manager-webhook-587ccfb98-dsglc\" (UID: \"4ed58049-7535-4707-b734-1b452c05aa7a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.419587 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.419493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ed58049-7535-4707-b734-1b452c05aa7a-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-dsglc\" (UID: \"4ed58049-7535-4707-b734-1b452c05aa7a\") " pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.506481 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.506444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:45.629452 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:45.629426 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-dsglc"]
Apr 21 06:31:45.631876 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:31:45.631848 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ed58049_7535_4707_b734_1b452c05aa7a.slice/crio-787faaf4805e05125cf63f25ff1419ec6333f4a91ef3a79c6f02dd8ea00be7e1 WatchSource:0}: Error finding container 787faaf4805e05125cf63f25ff1419ec6333f4a91ef3a79c6f02dd8ea00be7e1: Status 404 returned error can't find the container with id 787faaf4805e05125cf63f25ff1419ec6333f4a91ef3a79c6f02dd8ea00be7e1
Apr 21 06:31:46.011006 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.010972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc" event={"ID":"4ed58049-7535-4707-b734-1b452c05aa7a","Type":"ContainerStarted","Data":"787faaf4805e05125cf63f25ff1419ec6333f4a91ef3a79c6f02dd8ea00be7e1"}
Apr 21 06:31:46.134144 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.134108 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-pxdpq"]
Apr 21 06:31:46.138489 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.138469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.140718 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.140694 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-xkxwn\""
Apr 21 06:31:46.145441 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.145388 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-pxdpq"]
Apr 21 06:31:46.219694 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.219656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20bda56d-e2bc-4588-bd8b-71f9fbf07dce-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-pxdpq\" (UID: \"20bda56d-e2bc-4588-bd8b-71f9fbf07dce\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.220116 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.219722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7zr\" (UniqueName: \"kubernetes.io/projected/20bda56d-e2bc-4588-bd8b-71f9fbf07dce-kube-api-access-6s7zr\") pod \"cert-manager-cainjector-68b757865b-pxdpq\" (UID: \"20bda56d-e2bc-4588-bd8b-71f9fbf07dce\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.320896 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.320801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20bda56d-e2bc-4588-bd8b-71f9fbf07dce-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-pxdpq\" (UID: \"20bda56d-e2bc-4588-bd8b-71f9fbf07dce\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.320896 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.320863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7zr\" (UniqueName: \"kubernetes.io/projected/20bda56d-e2bc-4588-bd8b-71f9fbf07dce-kube-api-access-6s7zr\") pod \"cert-manager-cainjector-68b757865b-pxdpq\" (UID: \"20bda56d-e2bc-4588-bd8b-71f9fbf07dce\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.328374 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.328338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20bda56d-e2bc-4588-bd8b-71f9fbf07dce-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-pxdpq\" (UID: \"20bda56d-e2bc-4588-bd8b-71f9fbf07dce\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.328518 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.328429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s7zr\" (UniqueName: \"kubernetes.io/projected/20bda56d-e2bc-4588-bd8b-71f9fbf07dce-kube-api-access-6s7zr\") pod \"cert-manager-cainjector-68b757865b-pxdpq\" (UID: \"20bda56d-e2bc-4588-bd8b-71f9fbf07dce\") " pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.449008 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.448968 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq"
Apr 21 06:31:46.593376 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:46.593346 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-pxdpq"]
Apr 21 06:31:46.600092 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:31:46.597535 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20bda56d_e2bc_4588_bd8b_71f9fbf07dce.slice/crio-d4582f2b46374411ce668b8a7e0b351cb84af827832f2ff5bbf766ab69080837 WatchSource:0}: Error finding container d4582f2b46374411ce668b8a7e0b351cb84af827832f2ff5bbf766ab69080837: Status 404 returned error can't find the container with id d4582f2b46374411ce668b8a7e0b351cb84af827832f2ff5bbf766ab69080837
Apr 21 06:31:47.015353 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:47.015316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq" event={"ID":"20bda56d-e2bc-4588-bd8b-71f9fbf07dce","Type":"ContainerStarted","Data":"d4582f2b46374411ce668b8a7e0b351cb84af827832f2ff5bbf766ab69080837"}
Apr 21 06:31:49.027207 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:49.027168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq" event={"ID":"20bda56d-e2bc-4588-bd8b-71f9fbf07dce","Type":"ContainerStarted","Data":"f19f4614cb7c26d7b995a46d069781eb5216e9829c8f66c6ed5c7e6a30731fc7"}
Apr 21 06:31:49.028415 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:49.028392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc" event={"ID":"4ed58049-7535-4707-b734-1b452c05aa7a","Type":"ContainerStarted","Data":"c0a867bd6ce9e6da760aa98e0eea793052ddcfe00c3298ff40fdec1c76f9586d"}
Apr 21 06:31:49.028494 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:49.028479 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:49.053555 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:49.053506 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-pxdpq" podStartSLOduration=1.461599134 podStartE2EDuration="3.053493372s" podCreationTimestamp="2026-04-21 06:31:46 +0000 UTC" firstStartedPulling="2026-04-21 06:31:46.600481829 +0000 UTC m=+311.252927627" lastFinishedPulling="2026-04-21 06:31:48.192376068 +0000 UTC m=+312.844821865" observedRunningTime="2026-04-21 06:31:49.052193777 +0000 UTC m=+313.704639599" watchObservedRunningTime="2026-04-21 06:31:49.053493372 +0000 UTC m=+313.705939190"
Apr 21 06:31:49.069249 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:49.069196 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc" podStartSLOduration=1.516294542 podStartE2EDuration="4.06917811s" podCreationTimestamp="2026-04-21 06:31:45 +0000 UTC" firstStartedPulling="2026-04-21 06:31:45.633833949 +0000 UTC m=+310.286279747" lastFinishedPulling="2026-04-21 06:31:48.186717513 +0000 UTC m=+312.839163315" observedRunningTime="2026-04-21 06:31:49.068868067 +0000 UTC m=+313.721313896" watchObservedRunningTime="2026-04-21 06:31:49.06917811 +0000 UTC m=+313.721623930"
Apr 21 06:31:55.034979 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:55.034941 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-dsglc"
Apr 21 06:31:59.519018 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.518978 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"]
Apr 21 06:31:59.523402 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.523382 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.525568 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.525546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lqhdj\""
Apr 21 06:31:59.525705 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.525591 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 06:31:59.526325 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.526307 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 06:31:59.528843 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.528820 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"]
Apr 21 06:31:59.630898 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.630855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.631106 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.630918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.631106 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.630995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjb5\" (UniqueName: \"kubernetes.io/projected/68ea53f8-dc6f-4d4f-a282-6091318c7b56-kube-api-access-cbjb5\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.731973 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.731935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjb5\" (UniqueName: \"kubernetes.io/projected/68ea53f8-dc6f-4d4f-a282-6091318c7b56-kube-api-access-cbjb5\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.732173 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.732021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.732173 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.732048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.732377 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.732359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.732452 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.732430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.739558 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.739536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjb5\" (UniqueName: \"kubernetes.io/projected/68ea53f8-dc6f-4d4f-a282-6091318c7b56-kube-api-access-cbjb5\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.833342 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.833250 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:31:59.956395 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:31:59.956368 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"]
Apr 21 06:31:59.959962 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:31:59.959934 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ea53f8_dc6f_4d4f_a282_6091318c7b56.slice/crio-6df4fc056ebe455852b08b2682ba7456de4c2a8d30e3c8b5be0c986122dc5e75 WatchSource:0}: Error finding container 6df4fc056ebe455852b08b2682ba7456de4c2a8d30e3c8b5be0c986122dc5e75: Status 404 returned error can't find the container with id 6df4fc056ebe455852b08b2682ba7456de4c2a8d30e3c8b5be0c986122dc5e75
Apr 21 06:32:00.062473 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:00.062447 2577 generic.go:358] "Generic (PLEG): container finished" podID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerID="918699be665182f40acee1437db143c00361aa8d64eef2735d0e1e6aac0c74ee" exitCode=0
Apr 21 06:32:00.062578 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:00.062523 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh" event={"ID":"68ea53f8-dc6f-4d4f-a282-6091318c7b56","Type":"ContainerDied","Data":"918699be665182f40acee1437db143c00361aa8d64eef2735d0e1e6aac0c74ee"}
Apr 21 06:32:00.062578 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:00.062560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh" event={"ID":"68ea53f8-dc6f-4d4f-a282-6091318c7b56","Type":"ContainerStarted","Data":"6df4fc056ebe455852b08b2682ba7456de4c2a8d30e3c8b5be0c986122dc5e75"}
Apr 21 06:32:03.082654 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:03.082617 2577 generic.go:358] "Generic (PLEG): container finished" podID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerID="918ccf37eea8710a690d71f97fc2a9d49bf5cd2f7ef6348b42d899eda260e628" exitCode=0
Apr 21 06:32:03.083077 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:03.082709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh" event={"ID":"68ea53f8-dc6f-4d4f-a282-6091318c7b56","Type":"ContainerDied","Data":"918ccf37eea8710a690d71f97fc2a9d49bf5cd2f7ef6348b42d899eda260e628"}
Apr 21 06:32:04.087602 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:04.087569 2577 generic.go:358] "Generic (PLEG): container finished" podID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerID="8bf551745b5688a83421068da055637b42bf1e2e4fcf9bb6c550e4db86e53940" exitCode=0
Apr 21 06:32:04.088015 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:04.087662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh" event={"ID":"68ea53f8-dc6f-4d4f-a282-6091318c7b56","Type":"ContainerDied","Data":"8bf551745b5688a83421068da055637b42bf1e2e4fcf9bb6c550e4db86e53940"}
Apr 21 06:32:05.208050 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.208026 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:32:05.381682 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.381603 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-util\") pod \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") "
Apr 21 06:32:05.381682 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.381647 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjb5\" (UniqueName: \"kubernetes.io/projected/68ea53f8-dc6f-4d4f-a282-6091318c7b56-kube-api-access-cbjb5\") pod \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") "
Apr 21 06:32:05.381682 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.381685 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-bundle\") pod \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\" (UID: \"68ea53f8-dc6f-4d4f-a282-6091318c7b56\") "
Apr 21 06:32:05.382178 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.382146 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-bundle" (OuterVolumeSpecName: "bundle") pod "68ea53f8-dc6f-4d4f-a282-6091318c7b56" (UID: "68ea53f8-dc6f-4d4f-a282-6091318c7b56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 06:32:05.383866 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.383839 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68ea53f8-dc6f-4d4f-a282-6091318c7b56-kube-api-access-cbjb5" (OuterVolumeSpecName: "kube-api-access-cbjb5") pod "68ea53f8-dc6f-4d4f-a282-6091318c7b56" (UID: "68ea53f8-dc6f-4d4f-a282-6091318c7b56"). InnerVolumeSpecName "kube-api-access-cbjb5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 06:32:05.478313 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.478248 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-util" (OuterVolumeSpecName: "util") pod "68ea53f8-dc6f-4d4f-a282-6091318c7b56" (UID: "68ea53f8-dc6f-4d4f-a282-6091318c7b56"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 06:32:05.483037 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.483014 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-util\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:32:05.483131 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.483042 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cbjb5\" (UniqueName: \"kubernetes.io/projected/68ea53f8-dc6f-4d4f-a282-6091318c7b56-kube-api-access-cbjb5\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:32:05.483131 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:05.483053 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68ea53f8-dc6f-4d4f-a282-6091318c7b56-bundle\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\""
Apr 21 06:32:06.096163 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:06.096126 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh" event={"ID":"68ea53f8-dc6f-4d4f-a282-6091318c7b56","Type":"ContainerDied","Data":"6df4fc056ebe455852b08b2682ba7456de4c2a8d30e3c8b5be0c986122dc5e75"}
Apr 21 06:32:06.096163 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:06.096161 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df4fc056ebe455852b08b2682ba7456de4c2a8d30e3c8b5be0c986122dc5e75"
Apr 21 06:32:06.096356 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:06.096174 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78exjcdh"
Apr 21 06:32:57.310237 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310195 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"]
Apr 21 06:32:57.310684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310595 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerName="pull"
Apr 21 06:32:57.310684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310612 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerName="pull"
Apr 21 06:32:57.310684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310626 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerName="util"
Apr 21 06:32:57.310684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310633 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerName="util"
Apr 21 06:32:57.310684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310647 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerName="extract"
Apr 21 06:32:57.310684 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310656 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerName="extract"
Apr 21 06:32:57.310882 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.310765 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="68ea53f8-dc6f-4d4f-a282-6091318c7b56" containerName="extract"
Apr 21 06:32:57.313391 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.313376 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.315685 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.315650 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\""
Apr 21 06:32:57.315685 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.315652 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 06:32:57.316453 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.316434 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-xmthn\""
Apr 21 06:32:57.316570 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.316439 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 06:32:57.316570 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.316490 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\""
Apr 21 06:32:57.321391 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.321369 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"]
Apr 21 06:32:57.413565 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.413532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.413759 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.413571 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-cert\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.413759 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.413594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfhl2\" (UniqueName: \"kubernetes.io/projected/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-kube-api-access-lfhl2\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.514577 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.514545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.514577 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.514580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-cert\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.514797 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.514605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfhl2\" (UniqueName: \"kubernetes.io/projected/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-kube-api-access-lfhl2\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.515185 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.515163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.517132 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.517109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-cert\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.522246 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.522225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfhl2\" (UniqueName: \"kubernetes.io/projected/c9bc2318-ac6c-4bc6-bf07-9c0d43131f87-kube-api-access-lfhl2\") pod \"kubeflow-trainer-controller-manager-7778bbcbb8-ms24v\" (UID: \"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87\") " pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.623160 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.623065 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:32:57.746137 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:57.746110 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"]
Apr 21 06:32:57.748120 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:32:57.748087 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9bc2318_ac6c_4bc6_bf07_9c0d43131f87.slice/crio-0860f777cd91fe23067cbb1b811f8e018d1907e2e8ba8bbc587580c175709007 WatchSource:0}: Error finding container 0860f777cd91fe23067cbb1b811f8e018d1907e2e8ba8bbc587580c175709007: Status 404 returned error can't find the container with id 0860f777cd91fe23067cbb1b811f8e018d1907e2e8ba8bbc587580c175709007
Apr 21 06:32:58.269689 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:32:58.269650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v" event={"ID":"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87","Type":"ContainerStarted","Data":"0860f777cd91fe23067cbb1b811f8e018d1907e2e8ba8bbc587580c175709007"}
Apr 21 06:33:00.280207 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:33:00.280121 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v" event={"ID":"c9bc2318-ac6c-4bc6-bf07-9c0d43131f87","Type":"ContainerStarted","Data":"828596ea0aa6139725f38b0830c68fa98202fb3abada981f953723f512be4ffa"}
Apr 21 06:33:00.280207 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:33:00.280179 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:33:00.295132 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:33:00.295082 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v" podStartSLOduration=1.0333387 podStartE2EDuration="3.295066315s" podCreationTimestamp="2026-04-21 06:32:57 +0000 UTC" firstStartedPulling="2026-04-21 06:32:57.749849495 +0000 UTC m=+382.402295292" lastFinishedPulling="2026-04-21 06:33:00.011577104 +0000 UTC m=+384.664022907" observedRunningTime="2026-04-21 06:33:00.293804662 +0000 UTC m=+384.946250481" watchObservedRunningTime="2026-04-21 06:33:00.295066315 +0000 UTC m=+384.947512135"
Apr 21 06:33:16.289521 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:33:16.289493 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-7778bbcbb8-ms24v"
Apr 21 06:34:04.457313 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.457223 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bc5bc9dfb-vhsfh"]
Apr 21 06:34:04.459760 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.459717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc5bc9dfb-vhsfh"
Apr 21 06:34:04.469900 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.469870 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc5bc9dfb-vhsfh"]
Apr 21 06:34:04.563135 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.563095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-service-ca\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh"
Apr 21 06:34:04.563135 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.563140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-console-config\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh"
Apr 21 06:34:04.563343 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.563199 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-oauth-serving-cert\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh"
Apr 21 06:34:04.563343 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.563252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rw8\" (UniqueName: \"kubernetes.io/projected/88ec052a-9a65-4337-a735-68b80c349369-kube-api-access-j2rw8\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh"
Apr 21 06:34:04.563343
ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.563280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-trusted-ca-bundle\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.563343 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.563301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88ec052a-9a65-4337-a735-68b80c349369-console-oauth-config\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.563343 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.563326 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88ec052a-9a65-4337-a735-68b80c349369-console-serving-cert\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.663948 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.663913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rw8\" (UniqueName: \"kubernetes.io/projected/88ec052a-9a65-4337-a735-68b80c349369-kube-api-access-j2rw8\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.664142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.663961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-trusted-ca-bundle\") pod 
\"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.664142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.663994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88ec052a-9a65-4337-a735-68b80c349369-console-oauth-config\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.664142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.664019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88ec052a-9a65-4337-a735-68b80c349369-console-serving-cert\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.664142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.664085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-service-ca\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.664142 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.664114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-console-config\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.664393 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.664154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-oauth-serving-cert\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.664874 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.664833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-console-config\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.665059 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.665028 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-service-ca\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.665163 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.665143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-oauth-serving-cert\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.665199 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.665178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ec052a-9a65-4337-a735-68b80c349369-trusted-ca-bundle\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.666860 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.666837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/88ec052a-9a65-4337-a735-68b80c349369-console-oauth-config\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.666950 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.666911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88ec052a-9a65-4337-a735-68b80c349369-console-serving-cert\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.671563 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.671541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rw8\" (UniqueName: \"kubernetes.io/projected/88ec052a-9a65-4337-a735-68b80c349369-kube-api-access-j2rw8\") pod \"console-bc5bc9dfb-vhsfh\" (UID: \"88ec052a-9a65-4337-a735-68b80c349369\") " pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.771314 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.771274 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:04.901424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:04.901394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc5bc9dfb-vhsfh"] Apr 21 06:34:04.904092 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:34:04.904064 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ec052a_9a65_4337_a735_68b80c349369.slice/crio-512df677cf9148638f67d49760cc9a068a01ddef3b06b5104696e775f168838c WatchSource:0}: Error finding container 512df677cf9148638f67d49760cc9a068a01ddef3b06b5104696e775f168838c: Status 404 returned error can't find the container with id 512df677cf9148638f67d49760cc9a068a01ddef3b06b5104696e775f168838c Apr 21 06:34:05.497329 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:05.497295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc5bc9dfb-vhsfh" event={"ID":"88ec052a-9a65-4337-a735-68b80c349369","Type":"ContainerStarted","Data":"d412c94a81f323243815e6d0467d3988efad37b90ce49fcb1b6eba9d3ea8880c"} Apr 21 06:34:05.497329 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:05.497330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc5bc9dfb-vhsfh" event={"ID":"88ec052a-9a65-4337-a735-68b80c349369","Type":"ContainerStarted","Data":"512df677cf9148638f67d49760cc9a068a01ddef3b06b5104696e775f168838c"} Apr 21 06:34:05.512714 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:05.512668 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bc5bc9dfb-vhsfh" podStartSLOduration=1.512653464 podStartE2EDuration="1.512653464s" podCreationTimestamp="2026-04-21 06:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:34:05.511847789 +0000 UTC m=+450.164293608" 
watchObservedRunningTime="2026-04-21 06:34:05.512653464 +0000 UTC m=+450.165099283" Apr 21 06:34:14.772240 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:14.772199 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:14.772240 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:14.772251 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:14.777013 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:14.776991 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:15.535371 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:15.535334 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bc5bc9dfb-vhsfh" Apr 21 06:34:15.575936 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:15.575904 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-769674575b-kbdbh"] Apr 21 06:34:40.599161 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.599117 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-769674575b-kbdbh" podUID="b8cf4192-47a8-48b0-bcd2-b95b68a30478" containerName="console" containerID="cri-o://edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7" gracePeriod=15 Apr 21 06:34:40.836121 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.836086 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-769674575b-kbdbh_b8cf4192-47a8-48b0-bcd2-b95b68a30478/console/0.log" Apr 21 06:34:40.836246 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.836162 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-769674575b-kbdbh" Apr 21 06:34:40.895624 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.895531 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-oauth-config\") pod \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " Apr 21 06:34:40.895624 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.895583 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-config\") pod \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " Apr 21 06:34:40.895624 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.895618 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-oauth-serving-cert\") pod \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " Apr 21 06:34:40.895953 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.895637 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-service-ca\") pod \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " Apr 21 06:34:40.895953 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.895667 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-serving-cert\") pod \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " Apr 21 06:34:40.895953 ip-10-0-129-55 
kubenswrapper[2577]: I0421 06:34:40.895686 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-trusted-ca-bundle\") pod \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " Apr 21 06:34:40.895953 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.895707 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kkpt\" (UniqueName: \"kubernetes.io/projected/b8cf4192-47a8-48b0-bcd2-b95b68a30478-kube-api-access-5kkpt\") pod \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\" (UID: \"b8cf4192-47a8-48b0-bcd2-b95b68a30478\") " Apr 21 06:34:40.896159 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.896116 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b8cf4192-47a8-48b0-bcd2-b95b68a30478" (UID: "b8cf4192-47a8-48b0-bcd2-b95b68a30478"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:34:40.896212 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.896184 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b8cf4192-47a8-48b0-bcd2-b95b68a30478" (UID: "b8cf4192-47a8-48b0-bcd2-b95b68a30478"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:34:40.896212 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.896190 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-service-ca" (OuterVolumeSpecName: "service-ca") pod "b8cf4192-47a8-48b0-bcd2-b95b68a30478" (UID: "b8cf4192-47a8-48b0-bcd2-b95b68a30478"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:34:40.896390 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.896215 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-config" (OuterVolumeSpecName: "console-config") pod "b8cf4192-47a8-48b0-bcd2-b95b68a30478" (UID: "b8cf4192-47a8-48b0-bcd2-b95b68a30478"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:34:40.898200 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.898170 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8cf4192-47a8-48b0-bcd2-b95b68a30478-kube-api-access-5kkpt" (OuterVolumeSpecName: "kube-api-access-5kkpt") pod "b8cf4192-47a8-48b0-bcd2-b95b68a30478" (UID: "b8cf4192-47a8-48b0-bcd2-b95b68a30478"). InnerVolumeSpecName "kube-api-access-5kkpt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:34:40.898200 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.898175 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b8cf4192-47a8-48b0-bcd2-b95b68a30478" (UID: "b8cf4192-47a8-48b0-bcd2-b95b68a30478"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:34:40.898346 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.898237 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b8cf4192-47a8-48b0-bcd2-b95b68a30478" (UID: "b8cf4192-47a8-48b0-bcd2-b95b68a30478"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:34:40.997386 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.997340 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-oauth-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:34:40.997386 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.997382 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-config\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:34:40.997386 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.997396 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-oauth-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:34:40.997627 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.997410 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-service-ca\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:34:40.997627 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.997422 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b8cf4192-47a8-48b0-bcd2-b95b68a30478-console-serving-cert\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:34:40.997627 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.997434 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8cf4192-47a8-48b0-bcd2-b95b68a30478-trusted-ca-bundle\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:34:40.997627 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:40.997446 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5kkpt\" (UniqueName: \"kubernetes.io/projected/b8cf4192-47a8-48b0-bcd2-b95b68a30478-kube-api-access-5kkpt\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:34:41.617141 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.617117 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-769674575b-kbdbh_b8cf4192-47a8-48b0-bcd2-b95b68a30478/console/0.log" Apr 21 06:34:41.617581 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.617156 2577 generic.go:358] "Generic (PLEG): container finished" podID="b8cf4192-47a8-48b0-bcd2-b95b68a30478" containerID="edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7" exitCode=2 Apr 21 06:34:41.617581 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.617190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769674575b-kbdbh" event={"ID":"b8cf4192-47a8-48b0-bcd2-b95b68a30478","Type":"ContainerDied","Data":"edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7"} Apr 21 06:34:41.617581 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.617216 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-769674575b-kbdbh" Apr 21 06:34:41.617581 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.617229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769674575b-kbdbh" event={"ID":"b8cf4192-47a8-48b0-bcd2-b95b68a30478","Type":"ContainerDied","Data":"887ce6b587c5a842ddfed1815b58efb402c556f87301f1cf13d250a5b31cd00b"} Apr 21 06:34:41.617581 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.617246 2577 scope.go:117] "RemoveContainer" containerID="edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7" Apr 21 06:34:41.626242 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.626225 2577 scope.go:117] "RemoveContainer" containerID="edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7" Apr 21 06:34:41.626499 ip-10-0-129-55 kubenswrapper[2577]: E0421 06:34:41.626478 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7\": container with ID starting with edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7 not found: ID does not exist" containerID="edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7" Apr 21 06:34:41.626568 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.626506 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7"} err="failed to get container status \"edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7\": rpc error: code = NotFound desc = could not find container \"edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7\": container with ID starting with edd1206103f3488b852abf35246078f6530f4bdf0219fad47d6f46e4d15b02e7 not found: ID does not exist" Apr 21 06:34:41.642481 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.642444 2577 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-769674575b-kbdbh"] Apr 21 06:34:41.650050 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.650022 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-769674575b-kbdbh"] Apr 21 06:34:41.930715 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:34:41.930641 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8cf4192-47a8-48b0-bcd2-b95b68a30478" path="/var/lib/kubelet/pods/b8cf4192-47a8-48b0-bcd2-b95b68a30478/volumes" Apr 21 06:36:35.826794 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:36:35.826711 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:36:35.828448 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:36:35.828427 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:36:35.833049 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:36:35.833028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:36:35.834525 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:36:35.834502 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log" Apr 21 06:37:55.436745 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.436699 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx"] Apr 21 06:37:55.439168 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.437021 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8cf4192-47a8-48b0-bcd2-b95b68a30478" 
containerName="console" Apr 21 06:37:55.439168 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.437032 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cf4192-47a8-48b0-bcd2-b95b68a30478" containerName="console" Apr 21 06:37:55.439168 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.437095 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8cf4192-47a8-48b0-bcd2-b95b68a30478" containerName="console" Apr 21 06:37:55.440086 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.440070 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:37:55.442479 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.442424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"default-dockercfg-nvv9m\"" Apr 21 06:37:55.442639 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.442519 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"kube-root-ca.crt\"" Apr 21 06:37:55.442639 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.442519 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"openshift-service-ca.crt\"" Apr 21 06:37:55.451009 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.450982 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx"] Apr 21 06:37:55.533594 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.533552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq68h\" (UniqueName: \"kubernetes.io/projected/8f03ee64-e20f-4f96-afef-7c968975a4e7-kube-api-access-tq68h\") pod \"progression-custom-config-node-0-0-zztbx\" (UID: \"8f03ee64-e20f-4f96-afef-7c968975a4e7\") " 
pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:37:55.634467 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.634428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq68h\" (UniqueName: \"kubernetes.io/projected/8f03ee64-e20f-4f96-afef-7c968975a4e7-kube-api-access-tq68h\") pod \"progression-custom-config-node-0-0-zztbx\" (UID: \"8f03ee64-e20f-4f96-afef-7c968975a4e7\") " pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:37:55.642135 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.642100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq68h\" (UniqueName: \"kubernetes.io/projected/8f03ee64-e20f-4f96-afef-7c968975a4e7-kube-api-access-tq68h\") pod \"progression-custom-config-node-0-0-zztbx\" (UID: \"8f03ee64-e20f-4f96-afef-7c968975a4e7\") " pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:37:55.749761 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.749690 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:37:55.879010 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.878982 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx"] Apr 21 06:37:55.881613 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:37:55.881582 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f03ee64_e20f_4f96_afef_7c968975a4e7.slice/crio-60da6948970886b97b7fa4124cf7fbc38c278cb29281d8b68a7155ff4ca18048 WatchSource:0}: Error finding container 60da6948970886b97b7fa4124cf7fbc38c278cb29281d8b68a7155ff4ca18048: Status 404 returned error can't find the container with id 60da6948970886b97b7fa4124cf7fbc38c278cb29281d8b68a7155ff4ca18048 Apr 21 06:37:55.883791 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:55.883773 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 06:37:56.252797 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:37:56.252754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" event={"ID":"8f03ee64-e20f-4f96-afef-7c968975a4e7","Type":"ContainerStarted","Data":"60da6948970886b97b7fa4124cf7fbc38c278cb29281d8b68a7155ff4ca18048"} Apr 21 06:39:43.668134 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:39:43.668039 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" event={"ID":"8f03ee64-e20f-4f96-afef-7c968975a4e7","Type":"ContainerStarted","Data":"f08aed21d74c58acdf0628a2cd92be405f27f353e6cf8710a24083d4407ab301"} Apr 21 06:39:43.668608 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:39:43.668139 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:39:43.684531 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:39:43.684469 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" podStartSLOduration=1.178963323 podStartE2EDuration="1m48.684452084s" podCreationTimestamp="2026-04-21 06:37:55 +0000 UTC" firstStartedPulling="2026-04-21 06:37:55.883898075 +0000 UTC m=+680.536343873" lastFinishedPulling="2026-04-21 06:39:43.389386835 +0000 UTC m=+788.041832634" observedRunningTime="2026-04-21 06:39:43.68294646 +0000 UTC m=+788.335392290" watchObservedRunningTime="2026-04-21 06:39:43.684452084 +0000 UTC m=+788.336897903" Apr 21 06:39:45.675080 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:39:45.675048 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:40:06.673001 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:06.672932 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" podUID="8f03ee64-e20f-4f96-afef-7c968975a4e7" containerName="node" probeResult="failure" output="Get \"http://10.134.0.32:28080/metrics\": dial tcp 10.134.0.32:28080: connect: connection refused" Apr 21 06:40:07.673244 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:07.673197 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" podUID="8f03ee64-e20f-4f96-afef-7c968975a4e7" containerName="node" probeResult="failure" output="Get \"http://10.134.0.32:28080/metrics\": dial tcp 10.134.0.32:28080: connect: connection refused" Apr 21 06:40:07.673614 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:07.673325 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:40:07.673852 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:07.673826 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" podUID="8f03ee64-e20f-4f96-afef-7c968975a4e7" containerName="node" probeResult="failure" output="Get \"http://10.134.0.32:28080/metrics\": dial tcp 10.134.0.32:28080: connect: connection refused" Apr 21 06:40:07.747165 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:07.747132 2577 generic.go:358] "Generic (PLEG): container finished" podID="8f03ee64-e20f-4f96-afef-7c968975a4e7" containerID="f08aed21d74c58acdf0628a2cd92be405f27f353e6cf8710a24083d4407ab301" exitCode=0 Apr 21 06:40:07.747356 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:07.747210 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" event={"ID":"8f03ee64-e20f-4f96-afef-7c968975a4e7","Type":"ContainerDied","Data":"f08aed21d74c58acdf0628a2cd92be405f27f353e6cf8710a24083d4407ab301"} Apr 21 06:40:08.877317 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:08.877295 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:40:09.009707 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:09.009626 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq68h\" (UniqueName: \"kubernetes.io/projected/8f03ee64-e20f-4f96-afef-7c968975a4e7-kube-api-access-tq68h\") pod \"8f03ee64-e20f-4f96-afef-7c968975a4e7\" (UID: \"8f03ee64-e20f-4f96-afef-7c968975a4e7\") " Apr 21 06:40:09.011972 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:09.011939 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f03ee64-e20f-4f96-afef-7c968975a4e7-kube-api-access-tq68h" (OuterVolumeSpecName: "kube-api-access-tq68h") pod "8f03ee64-e20f-4f96-afef-7c968975a4e7" (UID: "8f03ee64-e20f-4f96-afef-7c968975a4e7"). InnerVolumeSpecName "kube-api-access-tq68h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:40:09.110898 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:09.110860 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tq68h\" (UniqueName: \"kubernetes.io/projected/8f03ee64-e20f-4f96-afef-7c968975a4e7-kube-api-access-tq68h\") on node \"ip-10-0-129-55.ec2.internal\" DevicePath \"\"" Apr 21 06:40:09.754798 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:09.754760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" event={"ID":"8f03ee64-e20f-4f96-afef-7c968975a4e7","Type":"ContainerDied","Data":"60da6948970886b97b7fa4124cf7fbc38c278cb29281d8b68a7155ff4ca18048"} Apr 21 06:40:09.754798 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:09.754796 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60da6948970886b97b7fa4124cf7fbc38c278cb29281d8b68a7155ff4ca18048" Apr 21 06:40:09.755042 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:09.754808 2577 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx" Apr 21 06:40:12.963119 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:12.963083 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx"] Apr 21 06:40:12.966844 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:12.966817 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-custom-config-node-0-0-zztbx"] Apr 21 06:40:13.930048 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:13.930018 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f03ee64-e20f-4f96-afef-7c968975a4e7" path="/var/lib/kubelet/pods/8f03ee64-e20f-4f96-afef-7c968975a4e7/volumes" Apr 21 06:40:24.139156 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:24.139124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-7778bbcbb8-ms24v_c9bc2318-ac6c-4bc6-bf07-9c0d43131f87/manager/0.log" Apr 21 06:40:24.578327 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:24.578295 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-7778bbcbb8-ms24v_c9bc2318-ac6c-4bc6-bf07-9c0d43131f87/manager/0.log" Apr 21 06:40:25.055880 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:40:25.055842 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-7778bbcbb8-ms24v_c9bc2318-ac6c-4bc6-bf07-9c0d43131f87/manager/0.log" Apr 21 06:41:01.050592 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.050556 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwwlk/must-gather-hwvwd"] Apr 21 06:41:01.051083 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.050906 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f03ee64-e20f-4f96-afef-7c968975a4e7" 
containerName="node" Apr 21 06:41:01.051083 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.050918 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f03ee64-e20f-4f96-afef-7c968975a4e7" containerName="node" Apr 21 06:41:01.051083 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.050975 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f03ee64-e20f-4f96-afef-7c968975a4e7" containerName="node" Apr 21 06:41:01.053899 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.053878 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 21 06:41:01.056260 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.056234 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lwwlk\"/\"default-dockercfg-2tfc9\"" Apr 21 06:41:01.056260 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.056248 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lwwlk\"/\"kube-root-ca.crt\"" Apr 21 06:41:01.056260 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.056242 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lwwlk\"/\"openshift-service-ca.crt\"" Apr 21 06:41:01.063152 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.063121 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/must-gather-hwvwd"] Apr 21 06:41:01.146585 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.146547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7afad825-156d-4207-a3d9-ddd6381272cc-must-gather-output\") pod \"must-gather-hwvwd\" (UID: \"7afad825-156d-4207-a3d9-ddd6381272cc\") " pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 21 06:41:01.146585 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.146590 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds892\" (UniqueName: \"kubernetes.io/projected/7afad825-156d-4207-a3d9-ddd6381272cc-kube-api-access-ds892\") pod \"must-gather-hwvwd\" (UID: \"7afad825-156d-4207-a3d9-ddd6381272cc\") " pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 21 06:41:01.247937 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.247899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7afad825-156d-4207-a3d9-ddd6381272cc-must-gather-output\") pod \"must-gather-hwvwd\" (UID: \"7afad825-156d-4207-a3d9-ddd6381272cc\") " pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 21 06:41:01.248094 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.247947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds892\" (UniqueName: \"kubernetes.io/projected/7afad825-156d-4207-a3d9-ddd6381272cc-kube-api-access-ds892\") pod \"must-gather-hwvwd\" (UID: \"7afad825-156d-4207-a3d9-ddd6381272cc\") " pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 21 06:41:01.248240 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.248220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7afad825-156d-4207-a3d9-ddd6381272cc-must-gather-output\") pod \"must-gather-hwvwd\" (UID: \"7afad825-156d-4207-a3d9-ddd6381272cc\") " pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 21 06:41:01.256130 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.256103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds892\" (UniqueName: \"kubernetes.io/projected/7afad825-156d-4207-a3d9-ddd6381272cc-kube-api-access-ds892\") pod \"must-gather-hwvwd\" (UID: \"7afad825-156d-4207-a3d9-ddd6381272cc\") " pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 
21 06:41:01.364034 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.363943 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwwlk/must-gather-hwvwd" Apr 21 06:41:01.484138 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.484026 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/must-gather-hwvwd"] Apr 21 06:41:01.486373 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:41:01.486339 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7afad825_156d_4207_a3d9_ddd6381272cc.slice/crio-906b400a43cbb2fcdcd0236834971573f8aa343f9c7506b370ae97f354aca3cf WatchSource:0}: Error finding container 906b400a43cbb2fcdcd0236834971573f8aa343f9c7506b370ae97f354aca3cf: Status 404 returned error can't find the container with id 906b400a43cbb2fcdcd0236834971573f8aa343f9c7506b370ae97f354aca3cf Apr 21 06:41:01.925030 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:01.924995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/must-gather-hwvwd" event={"ID":"7afad825-156d-4207-a3d9-ddd6381272cc","Type":"ContainerStarted","Data":"906b400a43cbb2fcdcd0236834971573f8aa343f9c7506b370ae97f354aca3cf"} Apr 21 06:41:02.930928 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:02.930846 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/must-gather-hwvwd" event={"ID":"7afad825-156d-4207-a3d9-ddd6381272cc","Type":"ContainerStarted","Data":"58212c2e5139f8170c3ec4b65607252c8ea93f015f144e2e5f2e95ac827b2452"} Apr 21 06:41:02.930928 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:02.930888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/must-gather-hwvwd" event={"ID":"7afad825-156d-4207-a3d9-ddd6381272cc","Type":"ContainerStarted","Data":"329067ee5bce2e7b5715c310c0d2ab2ae395d871ce8b9445411e73d23b5b06f0"} Apr 21 06:41:02.945337 
ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:02.945285 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lwwlk/must-gather-hwvwd" podStartSLOduration=1.210138803 podStartE2EDuration="1.945268873s" podCreationTimestamp="2026-04-21 06:41:01 +0000 UTC" firstStartedPulling="2026-04-21 06:41:01.488120232 +0000 UTC m=+866.140566029" lastFinishedPulling="2026-04-21 06:41:02.223250287 +0000 UTC m=+866.875696099" observedRunningTime="2026-04-21 06:41:02.943698754 +0000 UTC m=+867.596144574" watchObservedRunningTime="2026-04-21 06:41:02.945268873 +0000 UTC m=+867.597714693" Apr 21 06:41:03.616289 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:03.616260 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vd2jf_e87fa991-449a-49d9-be91-c70826350171/global-pull-secret-syncer/0.log" Apr 21 06:41:03.666085 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:03.666051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6hkzv_1448effd-0e95-4c37-bb1b-6de12bbf9fd9/konnectivity-agent/0.log" Apr 21 06:41:03.764077 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:03.764036 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-55.ec2.internal_8a0a94ae8a820fa7aa720c575cce5d73/haproxy/0.log" Apr 21 06:41:06.831871 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:06.831829 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-gmsbd_634bd9b2-8299-43f0-9124-eb65af43af1e/cluster-monitoring-operator/0.log" Apr 21 06:41:07.065164 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.065135 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8m8jk_09e05b13-b498-4dc1-8766-74242c9ea87e/node-exporter/0.log" Apr 21 06:41:07.084424 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.084351 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8m8jk_09e05b13-b498-4dc1-8766-74242c9ea87e/kube-rbac-proxy/0.log" Apr 21 06:41:07.108271 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.108241 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8m8jk_09e05b13-b498-4dc1-8766-74242c9ea87e/init-textfile/0.log" Apr 21 06:41:07.463176 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.463105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-trw9w_3fa012fc-52ee-4eb5-91dc-c3f733196956/prometheus-operator/0.log" Apr 21 06:41:07.484786 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.484756 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-trw9w_3fa012fc-52ee-4eb5-91dc-c3f733196956/kube-rbac-proxy/0.log" Apr 21 06:41:07.540333 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.540301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-785b4dc9d7-nfpwc_c41a9151-74f0-4a24-a44f-64934f976f61/telemeter-client/0.log" Apr 21 06:41:07.559976 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.559946 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-785b4dc9d7-nfpwc_c41a9151-74f0-4a24-a44f-64934f976f61/reload/0.log" Apr 21 06:41:07.582752 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:07.582708 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-785b4dc9d7-nfpwc_c41a9151-74f0-4a24-a44f-64934f976f61/kube-rbac-proxy/0.log" Apr 21 06:41:08.851670 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:08.851634 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-4rhg2_40dae373-b21d-4d63-9f06-93c00008d853/networking-console-plugin/0.log" Apr 21 
06:41:09.264131 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:09.264099 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/1.log" Apr 21 06:41:09.274150 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:09.274112 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fdnr4_6803da32-a76e-4d0e-916c-a12f322ff600/console-operator/2.log" Apr 21 06:41:09.612292 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:09.612220 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bc5bc9dfb-vhsfh_88ec052a-9a65-4337-a735-68b80c349369/console/0.log" Apr 21 06:41:10.017815 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.017784 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-mshlr_58c896dd-85e3-47f6-a9db-a8d9d4542bf1/volume-data-source-validator/0.log" Apr 21 06:41:10.556099 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.556061 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w"] Apr 21 06:41:10.562074 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.562041 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.567334 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.567308 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w"] Apr 21 06:41:10.629890 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.629854 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rkcmp_f85b9a72-4484-46f7-bab5-6a307b7bd43f/dns/0.log" Apr 21 06:41:10.651222 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.651194 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rkcmp_f85b9a72-4484-46f7-bab5-6a307b7bd43f/kube-rbac-proxy/0.log" Apr 21 06:41:10.740567 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.740534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-lib-modules\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.740785 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.740574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-podres\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.740785 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.740609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s965\" (UniqueName: \"kubernetes.io/projected/82cf17a6-0ed4-4574-98db-9f15b98d6139-kube-api-access-4s965\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: 
\"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.740785 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.740658 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-sys\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.740785 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.740699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-proc\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.758288 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.758259 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2hplw_83b7a9cf-9462-4e2a-901b-482dc68cb898/dns-node-resolver/0.log" Apr 21 06:41:10.842226 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-lib-modules\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842480 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-lib-modules\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " 
pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842614 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-podres\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842614 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s965\" (UniqueName: \"kubernetes.io/projected/82cf17a6-0ed4-4574-98db-9f15b98d6139-kube-api-access-4s965\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842614 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-sys\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842614 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-proc\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842873 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-proc\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842873 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-podres\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.842954 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.842892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82cf17a6-0ed4-4574-98db-9f15b98d6139-sys\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.851439 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.851412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s965\" (UniqueName: \"kubernetes.io/projected/82cf17a6-0ed4-4574-98db-9f15b98d6139-kube-api-access-4s965\") pod \"perf-node-gather-daemonset-gtj9w\" (UID: \"82cf17a6-0ed4-4574-98db-9f15b98d6139\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:10.873389 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:10.873364 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:11.009057 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:11.009030 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w"] Apr 21 06:41:11.012221 ip-10-0-129-55 kubenswrapper[2577]: W0421 06:41:11.012193 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod82cf17a6_0ed4_4574_98db_9f15b98d6139.slice/crio-3a65627e52f873f675b6c4f8ddcf3e9b12d87869733864715192aeb3e7dbf54e WatchSource:0}: Error finding container 3a65627e52f873f675b6c4f8ddcf3e9b12d87869733864715192aeb3e7dbf54e: Status 404 returned error can't find the container with id 3a65627e52f873f675b6c4f8ddcf3e9b12d87869733864715192aeb3e7dbf54e Apr 21 06:41:11.207581 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:11.207547 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gnb44_d222f49e-3ace-4fc2-9344-97a36ac9bc47/node-ca/0.log" Apr 21 06:41:11.876070 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:11.876041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5968f9cfc4-wxd8m_d743fd44-3762-47ee-9a4c-617f122ba333/router/0.log" Apr 21 06:41:11.967915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:11.967882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" event={"ID":"82cf17a6-0ed4-4574-98db-9f15b98d6139","Type":"ContainerStarted","Data":"f1408b1304be97ef760bb747c51fe9c022d576abdfb060d5c9ddc4910a6050a1"} Apr 21 06:41:11.967915 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:11.967919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" 
event={"ID":"82cf17a6-0ed4-4574-98db-9f15b98d6139","Type":"ContainerStarted","Data":"3a65627e52f873f675b6c4f8ddcf3e9b12d87869733864715192aeb3e7dbf54e"} Apr 21 06:41:11.968116 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:11.967942 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" Apr 21 06:41:11.983103 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:11.983055 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w" podStartSLOduration=1.98304137 podStartE2EDuration="1.98304137s" podCreationTimestamp="2026-04-21 06:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:41:11.981588408 +0000 UTC m=+876.634034228" watchObservedRunningTime="2026-04-21 06:41:11.98304137 +0000 UTC m=+876.635487189" Apr 21 06:41:12.166192 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:12.166128 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2dlsg_3336a9c5-62bd-44a2-8149-ccbdebfdb50a/serve-healthcheck-canary/0.log" Apr 21 06:41:12.523876 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:12.523835 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5hfwt_fb78ba44-67d7-4a52-b661-9a1c6e9c6b38/insights-operator/1.log" Apr 21 06:41:12.524337 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:12.524028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5hfwt_fb78ba44-67d7-4a52-b661-9a1c6e9c6b38/insights-operator/0.log" Apr 21 06:41:12.543430 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:12.543386 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-2mh7j_c54eb281-5e83-4630-8888-50ac3bdb9ed7/kube-rbac-proxy/0.log"
Apr 21 06:41:12.562449 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:12.562428 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2mh7j_c54eb281-5e83-4630-8888-50ac3bdb9ed7/exporter/0.log"
Apr 21 06:41:12.582625 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:12.582598 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2mh7j_c54eb281-5e83-4630-8888-50ac3bdb9ed7/extractor/0.log"
Apr 21 06:41:17.410533 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:17.410486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-j2sxp_504b76b3-d116-4731-aca7-01cb1970de58/kube-storage-version-migrator-operator/1.log"
Apr 21 06:41:17.412337 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:17.412311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-j2sxp_504b76b3-d116-4731-aca7-01cb1970de58/kube-storage-version-migrator-operator/0.log"
Apr 21 06:41:17.981890 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:17.981863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-gtj9w"
Apr 21 06:41:18.120312 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.120284 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vk77_5a9984cb-8f3e-47c2-b7d5-3612ff658e70/kube-multus-additional-cni-plugins/0.log"
Apr 21 06:41:18.140027 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.139997 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vk77_5a9984cb-8f3e-47c2-b7d5-3612ff658e70/egress-router-binary-copy/0.log"
Apr 21 06:41:18.159421 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.159396 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vk77_5a9984cb-8f3e-47c2-b7d5-3612ff658e70/cni-plugins/0.log"
Apr 21 06:41:18.178534 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.178501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vk77_5a9984cb-8f3e-47c2-b7d5-3612ff658e70/bond-cni-plugin/0.log"
Apr 21 06:41:18.197832 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.197807 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vk77_5a9984cb-8f3e-47c2-b7d5-3612ff658e70/routeoverride-cni/0.log"
Apr 21 06:41:18.217270 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.217243 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vk77_5a9984cb-8f3e-47c2-b7d5-3612ff658e70/whereabouts-cni-bincopy/0.log"
Apr 21 06:41:18.236564 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.236481 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7vk77_5a9984cb-8f3e-47c2-b7d5-3612ff658e70/whereabouts-cni/0.log"
Apr 21 06:41:18.597978 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.597882 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n768c_7d2bbbed-1117-4391-9611-601532f34a73/kube-multus/0.log"
Apr 21 06:41:18.643643 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.643603 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-276tk_14257089-c0ac-4007-81fc-ff9a9034e71b/network-metrics-daemon/0.log"
Apr 21 06:41:18.661416 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:18.661392 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-276tk_14257089-c0ac-4007-81fc-ff9a9034e71b/kube-rbac-proxy/0.log"
Apr 21 06:41:19.836392 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.836346 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-controller/0.log"
Apr 21 06:41:19.853205 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.853173 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/0.log"
Apr 21 06:41:19.862673 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.862634 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovn-acl-logging/1.log"
Apr 21 06:41:19.887635 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.887600 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/kube-rbac-proxy-node/0.log"
Apr 21 06:41:19.910673 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.910643 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 06:41:19.929467 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.929380 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/northd/0.log"
Apr 21 06:41:19.951206 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.951179 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/nbdb/0.log"
Apr 21 06:41:19.977713 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:19.977688 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/sbdb/0.log"
Apr 21 06:41:20.139587 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:20.139556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bdm62_be794aa6-58e2-4d4d-b76c-e85f84c36d7e/ovnkube-controller/0.log"
Apr 21 06:41:21.389071 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:21.389038 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-thvnj_7f77e68e-f3ad-422e-af2d-685ee3a97eaa/network-check-target-container/0.log"
Apr 21 06:41:22.258894 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:22.258862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-dwt8n_9f0218ab-e007-4a8a-a5d9-1682337a814f/iptables-alerter/0.log"
Apr 21 06:41:22.857003 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:22.856922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7mw75_45bfe8d3-3836-468f-bde5-17d1c54e53a8/tuned/0.log"
Apr 21 06:41:24.434495 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:24.434460 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2g6hc_50263a7c-1596-4353-a40d-4453e307fb4f/cluster-samples-operator/0.log"
Apr 21 06:41:24.450763 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:24.450709 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2g6hc_50263a7c-1596-4353-a40d-4453e307fb4f/cluster-samples-operator-watch/0.log"
Apr 21 06:41:25.576481 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:25.576440 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-99g55_90669bf5-ed00-4c9d-bc20-88d224b7c071/service-ca-controller/0.log"
Apr 21 06:41:25.971089 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:25.971062 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-nsw5z_1bb3c06d-f4bc-4567-9958-978a3b9398c2/csi-driver/0.log"
Apr 21 06:41:26.004435 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:26.004396 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-nsw5z_1bb3c06d-f4bc-4567-9958-978a3b9398c2/csi-node-driver-registrar/0.log"
Apr 21 06:41:26.038104 ip-10-0-129-55 kubenswrapper[2577]: I0421 06:41:26.038078 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-nsw5z_1bb3c06d-f4bc-4567-9958-978a3b9398c2/csi-liveness-probe/0.log"